Revisiting class-incremental learning with pre-trained models: Generalizability and adaptivity are all you need

DW Zhou, ZW Cai, HJ Ye, DC Zhan, Z Liu - arXiv preprint arXiv …, 2023 - arxiv.org
Class-incremental learning (CIL) aims to adapt to emerging new classes without forgetting
old ones. Traditional CIL models are trained from scratch to continually acquire knowledge …

Class-incremental learning: A survey

DW Zhou, QW Wang, ZH Qi, HJ Ye… - IEEE Transactions on …, 2024 - ieeexplore.ieee.org
Deep models, e.g., CNNs and Vision Transformers, have achieved impressive results
in many vision tasks in the closed world. However, novel classes emerge from time to time in …

Expandable subspace ensemble for pre-trained model-based class-incremental learning

DW Zhou, HL Sun, HJ Ye… - Proceedings of the IEEE …, 2024 - openaccess.thecvf.com
Class-Incremental Learning (CIL) requires a learning system to continually learn
new classes without forgetting. Despite the strong performance of Pre-Trained Models …

Trends and challenges of real-time learning in large language models: A critical review

M Jovanovic, P Voss - arXiv preprint arXiv:2404.18311, 2024 - arxiv.org
Real-time learning concerns the ability of learning systems to acquire knowledge over time,
enabling their adaptation and generalization to novel tasks. It is a critical ability for …

Towards General Industrial Intelligence: A Survey on IIoT-Enhanced Continual Large Models

J Chen, J He, F Chen, Z Lv, J Tang, W Li, Z Liu… - arXiv preprint arXiv …, 2024 - arxiv.org
Currently, most applications in the Industrial Internet of Things (IIoT) still rely on CNN-based
neural networks. Although Transformer-based large models (LMs), including language …

Exemplar-free continual representation learning via learnable drift compensation

A Gomez-Villa, D Goswami, K Wang… - … on Computer Vision, 2024 - Springer
Exemplar-free class-incremental learning using a backbone trained from scratch and
starting from a small first task presents a significant challenge for continual representation …

A Practitioner's Guide to Continual Multimodal Pretraining

K Roth, V Udandarao, S Dziadzio, A Prabhu… - arXiv preprint arXiv …, 2024 - arxiv.org
Multimodal foundation models serve numerous applications at the intersection of vision and
language. Still, despite being pretrained on extensive data, they become outdated over time …

Calibrating Higher-Order Statistics for Few-Shot Class-Incremental Learning with Pre-trained Vision Transformers

D Goswami, B Twardowski… - Proceedings of the …, 2024 - openaccess.thecvf.com
Few-shot class-incremental learning (FSCIL) aims to adapt the model to new classes from
very little data (5 samples) without forgetting the previously learned classes. Recent works in …

Sparse Orthogonal Parameters Tuning for Continual Learning

KP Ning, HJ Ke, YY Liu, JY Yao, YH Tian… - arXiv preprint arXiv …, 2024 - arxiv.org
Continual learning methods based on pre-trained models (PTMs), which adapt to
successive downstream tasks without catastrophic forgetting, have recently gained attention. These …

Premonition: Using generative models to preempt future data changes in continual learning

MD McDonnell, D Gong, E Abbasnejad… - arXiv preprint arXiv …, 2024 - arxiv.org
Continual learning requires a model to adapt to ongoing changes in the data distribution,
and often to the set of tasks to be performed. It is rare, however, that the data and task …