Asset Details
Quality Grading of Oudemansiella raphanipes Using Three-Teacher Knowledge Distillation with Cascaded Structure for LightWeight Neural Networks
by Peng, Yangyang; Hu, Haiying; Liu, Ming; Zhou, Hui; Chen, Haoxuan; Huang, Huamao
Subjects: Accuracy / Agricultural production / Algorithms / Automation / Classification / Datasets / Deep learning / Distillation / Knowledge / knowledge distillation / Machine learning / Mechanization / multi-teacher model / Mushrooms / Neural networks / Oudemansiella raphanipes / Performance indices / quality grading / Teachers
2025
Journal Article
Overview
Oudemansiella raphanipes is valued for its rich nutritional content and medicinal properties, but traditional manual grading methods are time-consuming and labor-intensive. To address this, deep learning techniques are employed to automate the grading process, and knowledge distillation (KD) is used to enhance the accuracy of a small-parameter model while maintaining a low resource occupation and fast response speed in resource-limited devices. This study employs a three-teacher KD framework and investigates three cascaded structures: the parallel model, the standard series model, and the series model with residual connections (residual-series model). The student model used is a lightweight ShuffleNet V2 0.5x, while the teacher models are VGG16, ResNet50, and Xception. Our experiments show that the cascaded structures result in improved performance indices, compared with the traditional ensemble model with equal weights; in particular, the residual-series model outperforms the other models, achieving a grading accuracy of 99.7% on the testing dataset with an average inference time of 5.51 ms. The findings of this study have the potential for broader application of KD in resource-limited environments for automated quality grading.
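The multi-teacher distillation idea summarized above can be illustrated with a minimal sketch. This is not the paper's cascaded three-teacher scheme; it shows only the generic ingredients of multi-teacher KD (temperature-softened teacher outputs, a KL term against the student, and a hard-label cross-entropy term). The function names, the equal averaging of teachers, and the `alpha`/`T` values are illustrative assumptions:

```python
import numpy as np

def softmax(logits, T=1.0):
    # Temperature-scaled softmax; higher T yields softer probabilities.
    z = logits / T
    z = z - z.max(axis=-1, keepdims=True)  # numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def multi_teacher_kd_loss(student_logits, teacher_logits_list, labels,
                          T=4.0, alpha=0.7):
    """Illustrative multi-teacher KD loss (not the paper's exact method):
    a weighted sum of (a) KL divergence from the student to the averaged
    temperature-softened teacher outputs, and (b) cross-entropy on the
    hard labels. The KL term is scaled by T^2, as is conventional in KD."""
    # Soft targets: equal-weight average of the softened teacher outputs.
    soft_targets = np.mean([softmax(t, T) for t in teacher_logits_list], axis=0)
    student_soft = softmax(student_logits, T)
    # KL(teachers || student), summed over classes, averaged over the batch.
    kl = np.sum(soft_targets * (np.log(soft_targets + 1e-12)
                                - np.log(student_soft + 1e-12)), axis=-1)
    # Hard-label cross-entropy at temperature 1.
    student_probs = softmax(student_logits, 1.0)
    ce = -np.log(student_probs[np.arange(len(labels)), labels] + 1e-12)
    return alpha * (T ** 2) * kl.mean() + (1 - alpha) * ce.mean()
```

In a real setup the three teacher logit arrays would come from the trained VGG16, ResNet50, and Xception models and the student logits from ShuffleNet V2 0.5x; the cascaded structures studied in the paper replace the simple equal-weight average used here.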