13 results for "Smart pig farming"
Weakly supervised learning through box annotations for pig instance segmentation
Pig instance segmentation is a critical component of smart pig farming, serving as the basis for advanced applications such as health monitoring and weight estimation. However, existing methods typically rely on large volumes of precisely labeled mask data, which are both difficult and costly to obtain, thereby limiting their scalability in real-world farming environments. To address this challenge, this paper proposes a novel approach that leverages simpler box annotations as supervisory information to train a pig instance segmentation network. In contrast to traditional methods, which depend on expensive mask annotations, our approach adopts a weakly supervised learning paradigm that reduces annotation cost. Specifically, we enhance the loss function of an existing weakly supervised instance segmentation model to better align with the requirements of pig instance segmentation. We conduct extensive experiments to compare the performance of the proposed method that only uses box annotations, with that of five fully supervised models requiring mask annotations and two weakly supervised baselines. Experimental results demonstrate that our method outperforms all existing weakly supervised approaches and three out of five fully supervised models. Moreover, compared with fully supervised methods, our approach exhibits only a 3% performance gap in mask prediction. Given that annotating a box takes merely 26 seconds, whereas annotating a mask requires 94 seconds, this minor accuracy trade-off is practically negligible. These findings highlight the value of employing box annotations for pig instance segmentation, offering a more cost-effective and scalable alternative without compromising performance. Our work not only advances the field of pig instance segmentation but also provides a viable pathway to deploy smart farming technologies in resource-limited settings, thereby contributing to more efficient and sustainable agricultural practices.
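The box-supervision idea above can be illustrated with a toy projection check: a predicted binary mask, projected onto the x and y axes, should switch on exactly the columns and rows that the annotated box covers. The sketch below is a hypothetical, simplified Python illustration of that consistency idea (the paper's actual loss is a differentiable variant of such a projection term inside a segmentation network, not this function):

```python
def box_projection_agreement(mask, box):
    """Toy projection check behind box-supervised segmentation losses.

    mask: 2D list of 0/1 pixel values (rows of equal length).
    box:  (x0, y0, x1, y1) inclusive pixel coordinates of the annotation.
    Returns the fraction of rows and columns whose max-projection of the
    mask agrees with what the box annotation implies.
    """
    x0, y0, x1, y1 = box
    h, w = len(mask), len(mask[0])
    # Max-project the mask onto each axis: a column/row is "on" if any pixel is.
    col_on = [any(mask[r][c] for r in range(h)) for c in range(w)]
    row_on = [any(mask[r][c] for c in range(w)) for r in range(h)]
    # The box says exactly which columns and rows should be "on".
    col_target = [x0 <= c <= x1 for c in range(w)]
    row_target = [y0 <= r <= y1 for r in range(h)]
    matches = sum(a == b for a, b in zip(col_on, col_target)) + \
              sum(a == b for a, b in zip(row_on, row_target))
    return matches / (w + h)
```

A mask that fills its box exactly scores 1.0; masks leaking outside the box, or missing rows/columns inside it, score lower — which is the signal box-only supervision exploits.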
An Internet of Things Platform Based on Microservices and Cloud Paradigms for Livestock
With the growing adoption of Internet of Things (IoT) technology in the agricultural sector, smart devices are becoming more prevalent. The availability of new, timely, and precise data offers a great opportunity to develop advanced analytical models, so the platform used to deliver new developments to the final user is a key enabler for adopting IoT technology. This work presents a generic design of a cloud-based software platform, implemented with microservices, to facilitate the use of predictive or prescriptive analytics in different IoT scenarios. Several technologies are combined to provide the essential features (scalability, portability, interoperability, and usability) that the platform must offer to assist decision-making in Agriculture 4.0 contexts. The platform is prepared to integrate new sensor devices, perform data operations, integrate several data sources, transfer complex statistical model developments seamlessly, and provide a user-friendly graphical interface. The proposed software architecture is implemented with open-source technologies and validated in a smart farming scenario: the growth of a batch of pigs at the fattening stage is estimated from data provided by a level sensor installed in the silo storing the animals' feed. With this application, we demonstrate how farmers can monitor the weight distribution of the batch and receive alarms when large deviations occur.
Transformation toward precision large-scale operations for sustainable farming: A review based on China’s pig industry
This review evaluates the current state of pig farming, identifies challenges, and offers projections for the sustainable development of the Chinese pig industry. A literature review using keyword searches was conducted on Google Scholar for articles from 2017–2023. The review included studies focused on pig farming in China, covering prospects, challenges, quantitative data on production, marketing, and consumption, and automation in livestock farming, drawing on peer-reviewed journals, credible websites, government reports, and conference proceedings. Pork consumption in China is increasing, and the country imports a sizable amount of pork annually. Although small-scale farms still account for most operations, the pig industry is at a critical stage of modernization and transition towards large-scale farming. The major challenges identified were feed, disease, antimicrobial resistance, environmental pollution, and pork prices. Smart technologies, such as cameras, the Internet of Things, and sensors, integrated into precision pig farming can improve productivity and animal health through real-time data collection and decision-making. Addressing these challenges will require substantial investment in large-scale transformation, the development of new precision livestock tools, the automation of manure treatment, and research into durable alternative energy sources such as photovoltaics and wind power. By implementing these strategies, large-scale precision pig farming in China can become economically and environmentally sustainable, ultimately benefiting consumers by supplying wholesome pork products.
PigFRIS: A Three-Stage Pipeline for Fence Occlusion Segmentation, GAN-Based Pig Face Inpainting, and Efficient Pig Face Recognition
Accurate animal face recognition is essential for effective health monitoring, behavior analysis, and productivity management in smart farming. However, environmental obstructions and animal behaviors complicate identification tasks. In pig farming, fences and frequent movements often occlude essential facial features, while high inter-class similarity makes distinguishing individuals even more challenging. To address these issues, we introduce the Pig Face Recognition and Inpainting System (PigFRIS). This integrated framework enhances recognition accuracy by removing occlusions and restoring missing facial features. PigFRIS employs state-of-the-art occlusion detection with the YOLOv11 segmentation model, a GAN-based inpainting reconstruction module using AOT-GAN, and a lightweight recognition module tailored for pig face classification. In doing so, our system detects occlusions, reconstructs obscured regions, and emphasizes key facial features, thereby improving overall performance. Experimental results validate the effectiveness of PigFRIS. For instance, YOLO11l achieves a recall of 94.92% and an AP50 of 96.28% for occlusion detection, AOT-GAN records an FID of 51.48 and an SSIM of 91.50% for image restoration, and EfficientNet-B2 attains an accuracy of 91.62% with an F1 score of 91.44% in classification. Additionally, heatmap analysis reveals that the system successfully focuses on relevant facial features rather than irrelevant occlusions, enhancing classification reliability. This work offers a novel and practical solution for animal face recognition in smart farming. It overcomes the limitations of existing methods and contributes to more effective livestock management and advancements in agricultural technology.
DCNN for Pig Vocalization and Non-Vocalization Classification: Evaluate Model Robustness with New Data
Since pig vocalization is an important indicator of pig condition, vocalization detection and recognition using deep learning play a crucial role in the management and welfare of modern pig livestock farming. However, collecting pig sound data for model training takes time and effort. Acknowledging this challenge, this study introduces a deep convolutional neural network (DCNN) architecture for pig vocalization and non-vocalization classification with a real pig farm dataset. Various audio feature extraction methods were evaluated individually to compare performance differences, including Mel-frequency cepstral coefficients (MFCC), Mel-spectrogram, Chroma, and Tonnetz. This study proposes a novel feature extraction method called Mixed-MMCT to improve classification accuracy by integrating MFCC, Mel-spectrogram, Chroma, and Tonnetz features. These feature extraction methods were applied to extract relevant features from the pig sound dataset for input into a deep learning network. For the experiment, three datasets were collected from three actual pig farms: Nias, Gimje, and Jeongeup. Each dataset consists of 4000 WAV files (2000 pig vocalization and 2000 pig non-vocalization) with a duration of three seconds. Various audio data augmentation techniques were utilized in the training set to improve model performance and generalization, including pitch-shifting, time-shifting, time-stretching, and background-noising. In this study, the performance of the predictive deep learning model was assessed using the k-fold cross-validation (k = 5) technique on each dataset. In rigorous experiments, Mixed-MMCT showed superior accuracy on Nias, Gimje, and Jeongeup, with rates of 99.50%, 99.56%, and 99.67%, respectively. Robustness experiments were performed to prove the effectiveness of the model, using two of the farm datasets for training and the remaining farm for testing. The average performance of Mixed-MMCT in terms of accuracy, precision, recall, and F1-score reached 95.67%, 96.25%, 95.68%, and 95.96%, respectively. All results demonstrate that the proposed Mixed-MMCT feature extraction method outperforms other methods for pig vocalization and non-vocalization classification in real pig livestock farming.
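The feature-fusion idea behind Mixed-MMCT (integrating MFCC, Mel-spectrogram, Chroma, and Tonnetz features) amounts to stacking per-frame feature matrices into one combined matrix. Below is a minimal Python sketch of that fusion step, assuming each feature block has already been extracted (in practice with a library such as librosa) and aligned to the same number of frames; `mixed_mmct` is a hypothetical name, not the paper's code:

```python
def mixed_mmct(mfcc, mel, chroma, tonnetz):
    """Row-stack four per-frame feature matrices into one.

    Each argument is a matrix as a list of rows (feature_dim x n_frames).
    The result keeps the frame axis and concatenates along the feature axis,
    mirroring the 'integrate all four feature types' idea of Mixed-MMCT.
    """
    n_frames = len(mfcc[0])
    for block in (mel, chroma, tonnetz):
        # All blocks must be time-aligned to the same number of frames.
        assert len(block[0]) == n_frames, "frame counts must match"
    return mfcc + mel + chroma + tonnetz
```

Per-block normalization before stacking is a common extra step so that no single feature family dominates the network's input scale.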
Data recording and use of data tools for pig health management: perspectives of stakeholders in pig farming
Data-driven strategies might combat the spread of infectious pig diseases and improve the early detection of potential pig health problems. The current study explored individual views on data recording and the use of data tools for pig health management by recruiting stakeholders (n = 202) in Spain, Ireland, and the Netherlands. The questionnaire focused on current on-farm challenges, the current status of data recording on farms, and the evaluation of two mock data tools: a "benchmarking tool" designed to visualize an individual farm's pig mortality, targeting the management of infectious respiratory and gastrointestinal diseases, and an "early-warning tool" designed to generate an alarm by monitoring coughs in pigs, targeting the management of infectious respiratory diseases. Results showed that respiratory and gastrointestinal diseases and aggressive behaviors were the most frequently mentioned health and welfare challenges, respectively. Most data were recorded electronically rather than on paper. In general, the "benchmarking tool" was perceived as useful for the management of infectious respiratory and gastrointestinal diseases, and the "early-warning tool" was evaluated as useful for the management of infectious respiratory diseases. Several barriers to the perceived usefulness of these tools were identified, such as a lack of contextual information, the inconvenience of data input, limited internet access, reliance on one's own experience and observation, technical hurdles, and mistrust of the information output. The perceived usefulness of both tools was higher among highly educated participants and among those who reported being integrators and being positive toward technology for disease control. Female participants and those from integrated farms evaluated the "early-warning tool" as more useful than their counterparts did. The perceived usefulness of the "early-warning tool" was negatively affected by age and work experience, but positively affected by the extensiveness of data recording, a positive attitude toward technology, and the current use of technology. In summary, participants showed optimistic views on the use of data tools to support their decision-making and the management of infectious pig respiratory and gastrointestinal diseases. Notably, data tools should not only convey the value of data for informed decision-making but also consider stakeholders' preconditions and needs.
Behavioral Monitoring Tool for Pig Farmers: Ear Tag Sensors, Machine Intelligence, and Technology Adoption Roadmap
Precision swine production can benefit from autonomous, noninvasive, and affordable devices that conduct frequent checks on the well-being status of pigs. Here, we present a remote monitoring tool for the objective measurement of some behavioral indicators that may help in assessing the health and welfare status—namely, posture, gait, vocalization, and external temperature. The multiparameter electronic sensor board is characterized by laboratory measurements and by animal tests. Relevant behavioral health indicators are discussed for implementing machine learning algorithms and decision support tools to detect animal lameness, lethargy, pain, injury, and distress. The roadmap for technology adoption is also discussed, along with challenges and the path forward. The presented technology can potentially lead to efficient management of farm animals, targeted focus on sick animals, medical cost savings, and less use of antibiotics.
DigiPig: First Developments of an Automated Monitoring System for Body, Head and Tail Detection in Intensive Pig Farming
The goal of this study was to develop an automated monitoring system for the detection of pigs' bodies, heads and tails. The aim in the first part of the study was to recognize individual pigs (in lying and standing positions) in groups and their body parts (head/ears, and tail) by using machine learning algorithms (feature pyramid network). In the second part of the study, the goal was to improve the detection of tail posture (tail straight and curled) during activity (standing/moving around) by the use of neural network analysis (YOLOv4). Our dataset (n = 583 images, 7579 pig postures) was annotated in Labelbox from 2D video recordings of groups (n = 12–15) of weaned pigs. The model recognized each individual pig's body with a precision of 96% relative to a threshold intersection over union (IoU), while the precision for tails was 77% and for heads 66%, already achieving human-level precision. Precision was highest for detecting pigs in groups, and lower for head and tail detection. As the first study was relatively time-consuming, in the second part of the study we performed a YOLOv4 neural network analysis using 30 annotated images of our dataset to detect straight and curled tails. With this model, we were able to recognize tail postures with a high level of precision (90%).
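The precision figures above are scored against an intersection-over-union (IoU) threshold: a detection counts as correct when its predicted box overlaps the ground-truth box by more than, say, 0.5. A minimal, self-contained sketch of IoU for axis-aligned boxes:

```python
def iou(box_a, box_b):
    """Intersection over Union for axis-aligned boxes (x0, y0, x1, y1).

    Returns a value in [0, 1]; detection benchmarks typically accept a
    prediction when IoU with the ground truth exceeds a threshold (e.g. 0.5).
    """
    ax0, ay0, ax1, ay1 = box_a
    bx0, by0, bx1, by1 = box_b
    # Intersection rectangle (empty if the boxes do not overlap).
    ix0, iy0 = max(ax0, bx0), max(ay0, by0)
    ix1, iy1 = min(ax1, bx1), min(ay1, by1)
    inter = max(0, ix1 - ix0) * max(0, iy1 - iy0)
    area_a = (ax1 - ax0) * (ay1 - ay0)
    area_b = (bx1 - bx0) * (by1 - by0)
    union = area_a + area_b - inter
    return inter / union if union else 0.0
```

For example, two 2x2 boxes overlapping in a single unit square share 1 unit of intersection over 7 units of union, giving an IoU of 1/7 — well below a 0.5 acceptance threshold.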
Validation of a Swine Cough Monitoring System Under Field Conditions
Precision livestock farming technologies support health monitoring on farms, yet few studies have evaluated their effectiveness under field conditions using reliable gold standards. This study evaluated a commercially available technology for detecting cough sounds in pigs on a commercial farm. Audio was recorded over six days using 16 microphones across two pig barns. A total of 1110 cough sounds were labelled by an on-site observer using a cough induction methodology, and 8938 other sounds from farm recordings and open-source datasets (ESC-50, UrbanSound8K, and AudioSet) were labelled. A hybrid deep learning model combining Convolutional Neural Networks and Recurrent Neural Networks was trained and evaluated using these labels. A total of 34 audio features were extracted from 1 s segments, including validated descriptors (e.g., MFCC), unverified external features, and proprietary features. Features were evaluated through 10-fold cross-validation based on classification performance and runtime, resulting in eight final features. The final model showed high performance (recall = 98.6%, specificity = 99.7%, precision = 98.8%, accuracy = 99.6%, F1-score = 98.6%). The technology was shown to be effective for monitoring cough sounds in a commercial swine production facility. Testing in other environments is recommended to evaluate its effectiveness in different farm settings.
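The 10-fold feature evaluation described above rests on a deterministic split of sample indices into folds, training on nine and testing on the held-out one. A minimal Python sketch of that splitting step (in practice a library utility such as scikit-learn's `KFold` would typically be used); `k_fold_indices` is a hypothetical helper name:

```python
def k_fold_indices(n_samples, k=10):
    """Yield (train, test) index lists for k-fold cross-validation.

    Samples are split into k contiguous folds of near-equal size; each fold
    serves as the test set exactly once while the rest form the training set.
    """
    # Distribute the remainder so fold sizes differ by at most one.
    fold_sizes = [n_samples // k + (1 if i < n_samples % k else 0)
                  for i in range(k)]
    start = 0
    for size in fold_sizes:
        test = list(range(start, start + size))
        train = list(range(0, start)) + list(range(start + size, n_samples))
        yield train, test
        start += size
```

Shuffling indices before splitting (with a fixed seed for reproducibility) is a common refinement when the data are ordered, e.g. by recording session.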
Characterising responses in group-housed pigs to Salmonella typhimurium infection through integrated computer vision–based behavioural monitoring and statistical analyses
Background: Health monitoring is crucial for early disease detection and prompt intervention. Computer vision is one of the novel methods for disease detection, but a significant gap remains in its application for detecting behavioural deviations associated with disease. This study employed YOLOv8s-based behavioural monitoring combined with statistical analysis to evaluate disease detection efficacy in group-housed pigs. Two groups of pigs (Control [CON] and Treatment [TRT]), 9–10 weeks old of a (Large White × Landrace) × Duroc cross, were raised for 21 days. The growing period was divided into three phases (adaptation, challenge, and recovery) and evaluated based on growth performance, health indicators (ear base temperature and faecal score), and behaviour (postures, feeding, and drinking). The TRT group was challenged with Salmonella typhimurium during the challenge period to induce infection, then treated with antibiotics. Two pre-trained YOLOv8s models were employed to quantify postures (lateral lying, sternal lying, standing, and sitting) and nutritive behaviours (feeding and drinking). Z-score analyses based on daily data (DZA) and on time-specific, 12-h interval data (TSZA) were used to detect behavioural anomalies, with the adaptation period as the baseline.

Results: During the challenge period, TRT pigs exhibited a drastic decline in growth, increased ear base temperature, and elevated faecal scores, confirming successful infection. Compensatory growth was observed during the recovery period. Automated behaviour monitoring enabled detailed temporal analysis of responses to infection, treatment, and environmental fluctuations. Notable behavioural deviations in the TRT group emerged at 4 days post-inoculation (DPI), aligning with significant health deterioration. However, health indicators diverged as early as 1 DPI, suggesting that group-based behavioural monitoring may be less sensitive to early individual responses. TSZA detected subtle behavioural anomalies earlier than DZA, with disruptions in the TRT group beginning at 0 DPI. These included sharp fluctuations in sitting, lying, and feeding behaviours, which gradually stabilised after treatment.

Conclusions: This study highlights the potential of computer vision-based behavioural monitoring as a non-invasive, high-throughput tool for real-time health surveillance. While effective for group assessments, the results emphasise the need for more advanced methods to enhance early disease detection and improve precision in pig health management.
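The z-score analyses described above (DZA on daily totals, TSZA on 12-h intervals) reduce to comparing each new observation against the mean and standard deviation of the adaptation-period baseline. A minimal sketch of that idea, with a hypothetical `zscore_anomalies` helper and an illustrative threshold of 2 standard deviations (the study's actual threshold is not stated in the abstract):

```python
def zscore_anomalies(baseline, observed, threshold=2.0):
    """Flag observations deviating from a baseline by more than `threshold`
    standard deviations.

    baseline: behavioural counts from the adaptation period.
    observed: behavioural counts from the challenge/recovery periods.
    Returns a list of booleans, True where |z| > threshold.
    """
    mean = sum(baseline) / len(baseline)
    var = sum((x - mean) ** 2 for x in baseline) / len(baseline)
    std = var ** 0.5
    if std == 0:
        std = 1.0  # guard: a flat baseline would make every z undefined
    return [abs((x - mean) / std) > threshold for x in observed]
```

Running the same flagging on daily sums versus 12-h bins is what distinguishes DZA from TSZA: the finer bins can surface short-lived disruptions that average out over a whole day.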