Catalogue Search | MBRL
Explore the vast range of titles available.
2 result(s) for "HS-FPN"
A Lightweight Pig Aggressive Behavior Recognition Model by Effective Integration of Spatio-Temporal Features
2025
With the rise of smart agriculture and the expansion of pig farming, recognizing aggressive behavior in pigs is crucial for maintaining herd health and improving farming efficiency. Differences in background and lighting across barns can cause missed and false detections of aggressive behavior. We therefore propose a deep learning-based pig aggressive behavior recognition model designed to improve adaptability in complex pig-farming environments. The model combines MobileNetV2 and Autoformer to extract local detail features of pig aggression and temporal correlation information from video frame sequences. Both the Convolutional Block Attention Module (CBAM) and the Advanced Filtering Feature Fusion Pyramid Network (HS-FPN) are integrated into the lightweight convolutional network MobileNetV2, enabling it to capture key visual features of pig aggression more accurately and to better detect small targets. Temporal correlations between consecutive frames are extracted by an improved Autoformer: a Gated Attention Unit (GAU) is embedded in the Autoformer encoder to focus on important features of pig aggression while reducing computational latency. Experimental validation on public datasets shows that the proposed model reaches 98.08% recall, 94.44% precision, 96.23% accuracy, and a 96.23% F1-score, with the parameter count reduced to 10.41 M. Compared with MobileNetV2-LSTM and MobileNetV2-GRU, accuracy improves by 3.5% and 3.0%, respectively. The model thus balances recognition accuracy and computational complexity, making it well suited to automatic pig aggression recognition in practical farming scenarios and providing data support for scientific feeding and management strategies in pig farming.
Journal Article
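The reported metrics above are internally consistent: taking the harmonic mean of the stated precision (94.44%) and recall (98.08%) reproduces the reported F1-score of 96.23%. A quick check in Python:

```python
# Verify the abstract's F1-score from its stated precision and recall.
precision = 0.9444
recall = 0.9808

# F1 is the harmonic mean of precision and recall.
f1 = 2 * precision * recall / (precision + recall)
print(f"F1 = {f1 * 100:.2f}%")  # F1 = 96.23%
```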
BHI-YOLO: A Lightweight Instance Segmentation Model for Strawberry Diseases
2024
In complex environments, strawberry disease segmentation models face challenges such as difficult segmentation, excessive parameters, and high computational loads, making them hard to run effectively on devices with limited computational resources. To run efficiently on low-power devices while still segmenting diseases effectively in complex scenarios, this paper proposes BHI-YOLO, a lightweight instance segmentation model based on YOLOv8n-seg. First, the Universal Inverted Bottleneck (UIB) module is integrated into the backbone network and merged with the C2f module to create the C2f_UIB module, which reduces the parameter count while expanding the receptive field. Second, HS-FPN is introduced to further reduce the parameter count and enhance the model's ability to fuse features across different levels. Finally, the Inverted Residual Mobile Block (iRMB) is combined with EMA to design the iRMA, enabling the model to efficiently combine global information to enhance local information. The experimental results demonstrate that the enhanced instance segmentation model for strawberry diseases achieves a mean average precision (mAP@50) of 93%. Compared to YOLOv8, the improved model increases mask mAP by 2.3% while reducing parameters by 47%, GFLOPs by 20%, and model size by 44.1%, a substantial lightweighting effect. By combining a lightweight architecture with enhanced feature fusion, the model is better suited to deployment on mobile devices and provides a reference for strawberry disease segmentation applications in agricultural environments.
Journal Article
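Both results above adopt HS-FPN, whose core idea is to derive channel weights from high-level features and use them to screen (filter) low-level features before fusion. A minimal NumPy sketch of that screening-and-fusion step; the shapes, pooling choices, and sigmoid gate here are illustrative assumptions, not either paper's exact implementation:

```python
import numpy as np

def channel_attention(feat):
    # Global average + max pooling over spatial dims -> sigmoid channel weights.
    # Simplified stand-in: the real HS-FPN learns these weights with conv layers.
    avg = feat.mean(axis=(1, 2))
    mx = feat.max(axis=(1, 2))
    return 1.0 / (1.0 + np.exp(-(avg + mx)))  # sigmoid gate, one weight per channel

def screen_and_fuse(low, high):
    """Filter low-level features with channel weights derived from high-level
    features, then fuse by addition (high is assumed already upsampled to
    match low's spatial size)."""
    w = channel_attention(high)            # (C,) channel weights
    filtered = low * w[:, None, None]      # screen the low-level features
    return filtered + high                 # fuse the two levels

low = np.random.rand(8, 16, 16)   # (C, H, W) low-level feature map
high = np.random.rand(8, 16, 16)  # high-level features, upsampled to match
fused = screen_and_fuse(low, high)
print(fused.shape)  # (8, 16, 16)
```

In the full networks this fused map feeds the detection or segmentation head; the screening step is what lets high-level semantics suppress low-level background clutter, which is why both papers credit HS-FPN with better small-target handling at a reduced parameter count.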