Asset Details
Teacher-student collaborative knowledge distillation for image classification
by Zhang, Yang; Xu, Chuanyun; Li, Tian; Li, Gang; Gao, Wenjian; Bai, Nanlan
in Classification / Collaboration / Datasets / Distillation / Image classification / Knowledge / Learning / Neural networks / Regularization methods / Teachers / Teaching
2023
Journal Article
Overview
A single model usually cannot learn all the appropriate features from limited data, which leads to poor performance on test data. To improve model performance, we propose a teacher-student collaborative knowledge distillation (TSKD) method that combines knowledge distillation and self-distillation. The method consists of two parts: learning from the teacher network and self-teaching within the student network. Learning from the teacher network allows the student network to use the teacher network's knowledge. Self-teaching within the student network builds a multi-exit network based on self-distillation and provides deep features as supervised information for training. In the inference stage, an ensemble votes on the classification results of the multiple sub-models in the student network. The experimental results demonstrate the superior performance of our method compared with a traditional knowledge distillation method and a self-distillation-based multi-exit network.
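The two ingredients named in the abstract can be sketched in plain Python: a temperature-softened distillation loss (the standard soft-target KL term from knowledge distillation) and a majority vote over the predictions of a multi-exit student's sub-models. This is a minimal illustration under assumed conventions, not the authors' TSKD implementation; the function names, temperature value, and voting rule are assumptions for the sketch.

```python
import math

def softmax(logits, temperature=1.0):
    # Temperature-scaled softmax; higher temperature softens the distribution.
    scaled = [z / temperature for z in logits]
    m = max(scaled)  # subtract max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def kd_loss(student_logits, teacher_logits, temperature=4.0):
    # KL divergence between softened teacher and student distributions,
    # scaled by T^2 as is conventional for soft-target distillation.
    # The temperature value here is an illustrative choice.
    p = softmax(teacher_logits, temperature)  # teacher (target) distribution
    q = softmax(student_logits, temperature)  # student distribution
    kl = sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))
    return (temperature ** 2) * kl

def ensemble_vote(exit_logits_list):
    # Majority vote over the argmax predictions of the student's exits,
    # mirroring the ensemble voting described for the inference stage.
    votes = [max(range(len(l)), key=l.__getitem__) for l in exit_logits_list]
    return max(set(votes), key=votes.count)
```

For example, when the student's logits match the teacher's exactly, `kd_loss` is zero, and `ensemble_vote` returns whichever class the majority of exits predict.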
Publisher
Springer Nature B.V