Asset Details
Revisiting Class-Incremental Learning with Pre-Trained Models: Generalizability and Adaptivity are All You Need
Journal Article
by Zhou, Da-Wei; Cai, Zi-Wen; Ye, Han-Jia; Zhan, De-Chuan; Liu, Ziwei
in Adaptation / Artificial Intelligence / Benchmarks / Computer Imaging / Computer Science / Computer Vision / Datasets / Image Processing and Computer Vision / Knowledge / Learning / Model Updating / Pattern Recognition / Pattern Recognition and Graphics / Special Issue on Open-World Visual Recognition / Vision
2025
Overview
Class-incremental learning (CIL) aims to adapt to emerging new classes without forgetting old ones. Traditional CIL models are trained from scratch to continually acquire knowledge as data evolves. Recently, pre-training has achieved substantial progress, making vast pre-trained models (PTMs) accessible for CIL. Contrary to traditional methods, PTMs possess generalizable embeddings, which can be easily transferred for CIL. In this work, we revisit CIL with PTMs and argue that the core factors in CIL are adaptivity for model updating and generalizability for knowledge transfer. (1) We first reveal that a frozen PTM can already provide generalizable embeddings for CIL. Surprisingly, a simple baseline (SimpleCIL) that continually sets the classifiers of the PTM to prototype features can beat the state of the art even without training on the downstream task. (2) Due to the distribution gap between pre-trained and downstream datasets, the PTM can be further cultivated with adaptivity via model adaptation. We propose AdaPt and mERge (Aper), which aggregates the embeddings of the PTM and the adapted model for classifier construction. Aper is a general framework that can be orthogonally combined with any parameter-efficient tuning method, and it holds the advantages of the PTM's generalizability and the adapted model's adaptivity. (3) Additionally, considering that previous ImageNet-based benchmarks are unsuitable in the era of PTMs due to data overlap, we propose four new benchmarks for assessment, namely ImageNet-A, ObjectNet, OmniBenchmark, and VTAB. Extensive experiments validate the effectiveness of Aper with a unified and concise framework. Code is available at https://github.com/zhoudw-zdw/RevisitingCIL.
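The SimpleCIL baseline described in the abstract amounts to replacing a learned classifier with class prototypes computed from frozen embeddings. Below is a minimal PyTorch sketch of that idea; the `backbone` interface, the data loader, and the cosine-similarity prediction rule are illustrative assumptions, not the paper's exact implementation (see the linked repository for that).

```python
import torch

# Sketch of the prototype-classifier idea behind SimpleCIL: keep the
# pre-trained backbone frozen and set each class's classifier weight
# to the mean (prototype) of that class's embeddings.

@torch.no_grad()
def build_prototype_classifier(backbone, loader, num_classes, feat_dim, device="cpu"):
    """Accumulate per-class mean embeddings from a frozen backbone."""
    backbone.eval()
    sums = torch.zeros(num_classes, feat_dim, device=device)
    counts = torch.zeros(num_classes, device=device)
    for images, labels in loader:
        feats = backbone(images.to(device))           # (B, feat_dim) embeddings
        labels = labels.to(device)
        sums.index_add_(0, labels, feats)             # per-class feature sums
        counts.index_add_(0, labels, torch.ones(len(labels), device=device))
    return sums / counts.clamp(min=1).unsqueeze(1)    # (num_classes, feat_dim) prototypes

@torch.no_grad()
def predict(backbone, prototypes, images):
    """Classify by cosine similarity between embeddings and prototypes."""
    feats = torch.nn.functional.normalize(backbone(images), dim=1)
    protos = torch.nn.functional.normalize(prototypes, dim=1)
    return (feats @ protos.T).argmax(dim=1)
```

Because nothing is trained, the same routine can simply be rerun (or extended row-by-row) as new classes arrive in each incremental stage.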
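Aper's aggregation step, as the abstract describes it, combines embeddings of the frozen PTM and an adapted model before constructing the classifier. A hedged sketch of that concatenation, assuming the adapted copy has already been produced by some parameter-efficient tuning on the first incremental task:

```python
import torch

# Sketch of the Aper aggregation step: concatenate the generalizable
# embedding of the frozen PTM with the task-adapted embedding, and build
# prototypes in the joint space. How the adapted model is obtained
# (adapters, prompts, etc.) is abstracted away here.

@torch.no_grad()
def aggregate_features(frozen_ptm, adapted_model, images):
    """Concatenate frozen and adapted embeddings along the feature dim."""
    f_general = frozen_ptm(images)     # generalizable pre-trained embedding
    f_adapted = adapted_model(images)  # adaptivity from model adaptation
    return torch.cat([f_general, f_adapted], dim=1)
```

Feeding these concatenated features into a prototype builder like the one sketched above yields a classifier that draws on both the PTM's generalizability and the adapted model's adaptivity.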
Publisher
Springer US / Springer / Springer Nature B.V.