Asset Details
Revisiting Consistency Regularization for Semi-Supervised Learning
by Dai, Dengxin; Fan, Yue; Kukleva, Anna; Schiele, Bernt
Subjects: Benchmarks / Consistency / Datasets / Informatics / Methods / Performance enhancement / Regularization / Semantics / Semi-supervised learning
2023
Journal Article
Overview
Consistency regularization is one of the most widely-used techniques for semi-supervised learning (SSL). Generally, the aim is to train a model that is invariant to various data augmentations. In this paper, we revisit this idea and find that enforcing invariance by decreasing distances between features from differently augmented images leads to improved performance. However, encouraging equivariance instead, by increasing the feature distance, further improves performance. To this end, we propose an improved consistency regularization framework by a simple yet effective technique, FeatDistLoss, that imposes consistency and equivariance on the classifier and the feature level, respectively. Experimental results show that our model defines a new state of the art across a variety of standard semi-supervised learning benchmarks as well as imbalanced semi-supervised learning benchmarks. Particularly, we outperform previous work by a significant margin in low data regimes and at large imbalance ratios. Extensive experiments are conducted to analyze the method, and the code will be published.
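The overview describes the core idea of FeatDistLoss: enforce consistency at the classifier level while encouraging equivariance at the feature level by increasing the distance between features of differently augmented views. The paper's code was not available at the time of this record, so the sketch below is a hypothetical illustration of that combination, not the authors' implementation; the function name, the choice of cross-entropy for classifier consistency, Euclidean distance for the feature term, and the weight `lam` are all assumptions.

```python
import numpy as np

def feat_dist_loss(feat_a, feat_b, logits_a, logits_b, lam=1.0):
    """Hypothetical sketch of the FeatDistLoss idea: classifier-level
    consistency (predictions on two augmented views should agree)
    combined with feature-level equivariance (the distance between the
    two feature vectors is rewarded, i.e. subtracted from the loss)."""
    def softmax(z):
        e = np.exp(z - z.max(axis=-1, keepdims=True))
        return e / e.sum(axis=-1, keepdims=True)

    # Classifier-level consistency: cross-entropy between the softmax
    # predictions of the two augmented views of the same image.
    p_a, p_b = softmax(logits_a), softmax(logits_b)
    consistency = -np.mean(np.sum(p_a * np.log(p_b + 1e-12), axis=-1))

    # Feature-level equivariance: larger distance between the two
    # feature vectors *lowers* the loss, pushing features apart.
    distance = np.mean(np.linalg.norm(feat_a - feat_b, axis=-1))
    return consistency - lam * distance
```

With identical logits, the loss is strictly smaller when the two feature vectors are far apart than when they coincide, which is the equivariance effect the abstract describes.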
Publisher
Springer Nature B.V.