30,463 result(s) for "Animals Identification."
Longer is Not Always Better
New techniques for the species-level sorting of millions of specimens are needed in order to accelerate species discovery, determine how many species live on earth, and develop efficient biomonitoring techniques. These sorting methods should be reliable, scalable, and cost-effective, as well as being largely insensitive to low-quality genomic DNA, given that this is usually all that can be obtained from museum specimens. Mini-barcodes seem to satisfy these criteria, but it is unclear how well they perform for species-level sorting when compared with full-length barcodes. This is here tested based on 20 empirical data sets covering ca. 30,000 specimens (5500 species) and six clade-specific data sets from GenBank covering ca. 98,000 specimens (20,000 species). All specimens in these data sets had full-length barcodes and had been sorted to species-level based on morphology. Mini-barcodes of different lengths and positions were obtained in silico from full-length barcodes using a sliding window approach (three windows: 100 bp, 200 bp, and 300 bp) and by excising nine mini-barcodes with established primers (length: 94–407 bp). We then tested whether barcode length and/or position reduces species-level congruence between morphospecies and molecular operational taxonomic units (mOTUs) that were obtained using three different species delimitation techniques (Poisson Tree Process, Automatic Barcode Gap Discovery, and Objective Clustering). Surprisingly, we find no significant differences in performance for either species- or specimen-level identification between full-length and mini-barcodes as long as they are of moderate length (≥200 bp). Only very short mini-barcodes (<200 bp) perform poorly, especially when they are located near the 5′ end of the Folmer region. The mean congruence between morphospecies and mOTUs was ca. 75% for barcodes >200 bp, and the congruent mOTUs contain ca. 75% of all specimens. Most conflict is caused by ca. 10% of the specimens, which can be identified and should be targeted for reexamination in order to efficiently resolve conflict. Our study suggests that large-scale species discovery, identification, and metabarcoding can utilize mini-barcodes without any demonstrable loss of information compared to full-length barcodes.
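The in silico sliding-window extraction of mini-barcodes from full-length barcodes can be sketched in a few lines. The sequence is a placeholder (not real COI data) and the 50 bp step size is an assumption for illustration; the window lengths are the three used in the study:

```python
# Sketch: sliding-window extraction of mini-barcodes from a full-length
# barcode sequence. The 50 bp step and the placeholder sequence are
# illustrative assumptions, not values from the study.

def sliding_mini_barcodes(barcode: str, window: int, step: int = 50):
    """Yield (start, subsequence) mini-barcodes of a fixed window length."""
    for start in range(0, len(barcode) - window + 1, step):
        yield start, barcode[start:start + window]

full_barcode = "ACGT" * 165      # placeholder ~660 bp sequence
for window in (100, 200, 300):   # window lengths tested in the study
    n = sum(1 for _ in sliding_mini_barcodes(full_barcode, window))
    print(f"{window} bp windows: {n} mini-barcodes")
```

Each resulting subsequence would then be clustered into mOTUs and compared against the morphospecies sort, exactly as done for the full-length barcodes.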
Birds and animals of Australia's Top End : Darwin, Kakadu, Katherine, and Kununurra
One of the most amazing and accessible wildlife-watching destinations on earth, the Top End of Australia's Northern Territory is home to incredible birds and animals--from gaudy Red-collared Lorikeets to sinister Estuarine Crocodiles and raucous Black Flying-foxes. With this lavishly illustrated photographic field guide, you will be able to identify the most common creatures and learn about their fascinating biology--from how Agile Wallaby mothers can pause their pregnancies to why Giant Frogs spend half the year buried underground in waterproof cocoons. The Top End stretches from the tropical city of Darwin in the north, to the savannas of Mataranka in the south, and southwest across the vast Victoria River escarpments to the Western Australian border. The region includes some of Australia's most popular and impressive tourist destinations, such as Kakadu, Litchfield, Nitmiluk, and Gregory national parks, and is visited by more than two hundred thousand tourists every year. An essential field guide for anyone visiting the Top End, this book will vastly enhance your appreciation of the region's remarkable wildlife.
  • Features hundreds of stunning color photographs
  • Includes concise information on identification and preferred habitat for each species
  • Provides a summary of each species' life history, including interesting habits, and suggestions on where to see it
  • Offers valuable tips on searching for wildlife in the Top End
Detection of Cattle Using Drones and Convolutional Neural Networks
Multirotor drones have been one of the most important technological advances of the last decade. Their mechanics are simple compared to other types of drones and their possibilities in flight are greater; for example, they can take off vertically. Their capabilities have therefore brought progress to many professional activities. Moreover, advances in computing and telecommunications have also broadened the range of activities in which drones may be used. Currently, artificial intelligence and information analysis are the main areas of research in the field of computing. The case study presented in this article employed artificial intelligence techniques in the analysis of information captured by drones. More specifically, the camera installed on the drone took images that were later analyzed using Convolutional Neural Networks (CNNs) to identify the objects captured in them. In this research, a CNN was trained to detect cattle; however, the same training process could be followed to develop a CNN for the detection of any other object. This article describes the design of the platform for real-time analysis of information and its performance in the detection of cattle.
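As a rough illustration of the core operation a CNN applies to such drone imagery, the following NumPy sketch implements a single convolution-plus-ReLU step. The kernel, image, and "bright blob" are invented toy values for illustration, not the trained network from the article:

```python
import numpy as np

# Minimal sketch of one CNN building block (2D convolution + ReLU) on a
# single-channel image patch. A real cattle detector stacks many learned
# filters and layers; this hand-picked kernel is illustrative only.

def conv2d(image: np.ndarray, kernel: np.ndarray) -> np.ndarray:
    kh, kw = kernel.shape
    oh, ow = image.shape[0] - kh + 1, image.shape[1] - kw + 1
    out = np.empty((oh, ow))
    for i in range(oh):
        for j in range(ow):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return np.maximum(out, 0.0)  # ReLU activation

# A simple vertical-edge kernel: responds where a bright region
# (standing in for a light-coated animal) meets darker ground.
kernel = np.array([[1.0, -1.0], [1.0, -1.0]])
image = np.zeros((6, 6))
image[2:5, 1:4] = 1.0  # bright blob standing in for an animal
activation = conv2d(image, kernel)
print(activation.max())  # strongest response at the blob's right edge
```

A trained CNN learns many such kernels from labeled examples instead of hand-picking them, and deeper layers combine their responses into object-level detections.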
Postnatal piglet husbandry practices and well-being: The effects of alternative techniques delivered separately
The aim of this study was to evaluate stress responses evoked by 2 alternative methods for performing the following processing procedures: 1) teeth resection--clipping vs. grinding; 2) tail docking--cold vs. hot clipping; 3) identification--ear notch vs. tag; 4) iron administration--injection vs. oral; 5) castration--cords cut vs. torn. Eight to 10 litters of 1-, 2-, and 3-d-old piglets were assigned to each procedure. Within each litter, 2 piglets were assigned to 1 of 4 possible treatments: the 2 alternative methods, a sham procedure, and a sham procedure plus blood sampling. Blood was sampled before processing and at 45 min, 4 h, 48 h, 1 wk, and 2 wk postprocedure and assayed for cortisol and β-endorphin. Procedures were videotaped and analyzed to evaluate the time taken to perform the procedure and the number of squeals, grunts, and escape attempts. Vocalizations were analyzed to determine mean and peak frequencies and duration. Piglets were weighed before the procedure and at 24 h, 48 h, 1 wk, and 2 wk afterward. Lesions were scored on a scale of 0 to 5 on pigs in the identification, tail docking, and castration treatments at 24 h, 1 wk, and 2 wk postprocedure. For teeth resection, grinding took longer than clipping and resulted in greater cortisol concentration overall, poorer growth rates, and longer vocalizations compared with pigs in the control treatment (P < 0.05). For tail docking, hot clipping took longer, and hot-clipped piglets grew slower than cold-clipped piglets (P < 0.05). Hot clipping also resulted in longer and higher frequency squealing compared with pigs in the control treatment (P < 0.01). For identification, ear notching took longer than tagging, and ear-notched piglets had worse wound scores than tagged piglets (P < 0.05). Cortisol concentrations at 4 h also tended to be greater for ear-notched piglets (P < 0.10). Ear notching evoked calls with higher peak frequencies than the control treatments.
For iron administration, oral delivery took numerically longer than injecting, but there were no significant differences between injecting and oral delivery for any of the measures. For castration, tearing took longer than cutting the cords (P < 0.05), but β-endorphin concentrations at 45 min postprocedure were greater for cut piglets. When measures of behavior, physiology, and productivity were used, the responses to teeth resection, tail docking, and identification were shown to be altered by the procedural method, whereas responses to iron administration and castration did not differ. The time taken to carry out the procedure would appear to be an important factor in the strength of the stress response.
100 first animals
Children's board book identifying one hundred animals, with colorful illustrations. Each illustration shows a single animal, labeled with the name of its animal type.
AI-enhanced real-time cattle identification system through tracking across various environments
Video-based monitoring is essential nowadays in cattle farm management systems for automated evaluation of cow health, encompassing body condition scores, lameness detection, calving events, and other factors. In order to efficiently monitor the well-being of each individual animal, it is vital to identify them automatically in real time. Although there are various techniques available for cattle identification, a significant number of them depend on radio-frequency or visible ear tags, which are prone to being lost or damaged. This can result in financial losses for farmers. Therefore, this paper presents a novel method for tracking and identifying cattle with an RGB image-based camera. As a first step, to detect the cattle in the video, we employ the YOLOv8 (You Only Look Once) model. The sample data contain raw video recorded by cameras installed above the designated lane used by cattle after the milk production process and above the rotating milking parlor. As a second step, the detected cattle are continuously tracked and assigned unique local IDs. The tracked images of each individual animal are then stored in separate folders according to their respective IDs, facilitating the identification process. Features are then extracted from the images in each folder using a VGG (Visual Geometry Group) feature extractor. As a final step, an SVM (Support Vector Machine) identifier is applied to these features to produce an identified ID per image. The final ID of an animal is determined based on the maximum identified output ID across the tracked images of that particular animal. The outcomes of this paper serve as proof of concept that combining VGG features with an SVM is an effective and promising approach for an automatic cattle identification system.
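The track-level decision described above (each tracked image yields a per-frame SVM prediction, and the animal's final ID is the most frequent prediction across its track) can be sketched with the standard library. The ID labels are hypothetical:

```python
from collections import Counter

# Sketch of the final-ID step: majority vote over the per-frame SVM
# predictions collected for one tracked animal. IDs are illustrative.

def final_cattle_id(per_frame_predictions: list[str]) -> str:
    """Return the most frequent per-frame identification output."""
    counts = Counter(per_frame_predictions)
    return counts.most_common(1)[0][0]

track = ["cow_17", "cow_17", "cow_03", "cow_17", "cow_17", "cow_21"]
print(final_cattle_id(track))  # → cow_17
```

Voting across a whole track makes the final ID robust to occasional per-frame misclassifications, which is the practical benefit of combining tracking with per-image identification.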
Multistage pig identification using a sequential ear tag detection pipeline
Reliable animal identification in livestock husbandry is essential for various applications, including behavioral monitoring, welfare assessment, and the analysis of social structures. Although recent advancements in deep learning models have improved animal identification using biometric markers, their applicability remains limited for species without distinctive traits, such as pigs. Consequently, synthetic features such as ear tags have become widely adopted. However, challenges such as poor lighting conditions and the complexity of ear tag coding continue to restrict the effectiveness of computer vision and deep learning techniques. In this study, we introduce a robust, lighting-invariant method for individual pig identification that leverages commercially available ear tags within a sequential detection pipeline. Our approach employs four object detection models in succession to detect pigs, localize ear tags, perform rotation correction via pin detection, and recognize digits, ultimately generating a reliable ID proposal. In a first evaluation stage, we assessed the performance of each model independently, achieving mAP@0.95 values of 0.970, 0.979, 0.974, and 0.979 for the pig detection, ear tag detection, pin detection, and ID classification models, respectively. In addition, our method was further evaluated in two different camera environments to assess its performance in both familiar and unfamiliar conditions. The results demonstrate that the proposed approach achieves a very high precision of 0.996 in a familiar top-down camera scenario and maintains strong generalization performance in an unfamiliar close-up setup, with a precision of 0.913 and a recall of 0.903. Furthermore, by publicly releasing three custom datasets for ear tag, pin, and digit detection, we aim to support reproducibility and further research in automated animal identification for precision livestock farming. The findings of this study demonstrate the effectiveness of ID-based animal identification, and the proposed method could be integrated within advanced multi-object tracking systems to enable continuous animal observation and monitoring of specific target areas, thereby significantly enhancing overall livestock management systems.
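The four-stage sequential pipeline described above can be sketched as a simple composition of stages. Each function below is a hypothetical placeholder standing in for a trained object detection model, and all coordinates, angles, and the tag number are invented illustrative values:

```python
# Sketch of the sequential ear tag pipeline: pig detection -> ear tag
# localization -> rotation correction via pin detection -> digit
# recognition. Every stage is a placeholder for a trained detector;
# the boxes, angle, and digits are illustrative assumptions.

def detect_pigs(frame):          # stage 1: pig bounding boxes
    return [{"pig_box": (10, 10, 120, 90)}]

def detect_ear_tag(pig):         # stage 2: tag region within the pig crop
    return {"tag_box": (30, 12, 58, 34)}

def correct_rotation(tag):       # stage 3: align the tag upright via its pin
    return {"angle_deg": -12.5}

def read_digits(aligned_tag):    # stage 4: per-digit classification
    return "4711"

def identify(frame):
    """Run the four stages in succession and return one ID per pig."""
    ids = []
    for pig in detect_pigs(frame):
        tag = detect_ear_tag(pig)
        aligned = correct_rotation(tag)
        ids.append(read_digits(aligned))
    return ids

print(identify(frame=None))  # → ['4711']
```

The appeal of this staged design is that each model solves one narrow sub-problem (and can be evaluated independently, as in the study), while the composition yields the final ID proposal.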