Asset Details
Knowledge Distillation Meets Open-Set Semi-supervised Learning
Journal Article
by Martinez, Brais; Bulat, Adrian; Yang, Jing; Tzimiropoulos, Georgios; Zhu, Xiatian
in Analysis / Artificial Intelligence / Combinatorial analysis / Computer Imaging / Computer Science / Face recognition / Image Processing and Computer Vision / Knowledge / Knowledge representation / Machine learning / Pattern Recognition / Pattern Recognition and Graphics / Semantics / Semi-supervised learning / Source code / Teachers / Vision
2025
Overview
Existing knowledge distillation methods mostly focus on distilling the teacher’s predictions and intermediate activations. However, the structured representation, arguably one of the most critical ingredients of deep models, is largely overlooked. In this work, we propose a novel semantic representational distillation (SRD) method dedicated to distilling representational knowledge semantically from a pretrained teacher to a target student. The key idea is to leverage the teacher’s classifier as a semantic critic for evaluating the representations of both teacher and student, distilling semantic knowledge with high-order structured information over all feature dimensions. This is accomplished by introducing a notion of cross-network logits, computed by passing the student’s representation through the teacher’s classifier. Further, treating the set of seen classes as a basis of the semantic space from a combinatorial perspective, we scale SRD to unseen classes, enabling effective exploitation of widely available, arbitrary unlabeled training data. At the problem level, this establishes an interesting connection between knowledge distillation and open-set semi-supervised learning (SSL). Extensive experiments show that our SRD significantly outperforms previous state-of-the-art knowledge distillation methods on both coarse object classification and fine-grained face recognition tasks, as well as on the less studied yet practically crucial task of binary network distillation. Under the more realistic open-set SSL settings we introduce, we reveal that knowledge distillation is generally more effective than existing out-of-distribution sample detection, and that our proposed SRD is superior to both previous distillation and SSL competitors. The source code is available at https://github.com/jingyang2017/SRD_ossl.
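The cross-network logit idea described above can be illustrated with a short sketch. The following is a minimal, hypothetical PyTorch rendering, not the authors’ released implementation: it assumes the student’s representation already matches (or has been projected to) the input dimension of the teacher’s classifier, and it pairs the cross-network logits with a standard temperature-softened KL objective as one plausible choice of distillation loss; the function name and interface are illustrative only.

```python
import torch
import torch.nn.functional as F

def cross_network_logit_loss(student_feat, teacher_feat, teacher_classifier, T=4.0):
    """Sketch of a cross-network logit distillation loss.

    student_feat / teacher_feat: penultimate-layer representations of the
    same input batch; assumed here to share the teacher classifier's input
    dimension (in practice a learned projection could align them).
    teacher_classifier: the teacher's final linear layer, kept frozen and
    used as a 'semantic critic' over representations.
    """
    # Cross-network logits: the student's representation is scored by
    # the teacher's classifier.
    cross_logits = teacher_classifier(student_feat)

    # Reference logits from the teacher's own representation; no gradient
    # flows into the (frozen) teacher.
    with torch.no_grad():
        teacher_logits = teacher_classifier(teacher_feat)

    # Match softened class distributions (one plausible objective; the
    # paper's exact loss may differ).
    loss = F.kl_div(
        F.log_softmax(cross_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)
    return loss
```

Note that a loss of this form requires no ground-truth labels, which is consistent with the abstract’s point that the scheme can be applied to arbitrary unlabeled, potentially open-set training data.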
Publisher
Springer US; Springer; Springer Nature B.V.