Revisiting class-incremental learning with pre-trained models: Generalizability and adaptivity are all you need
Class-incremental learning (CIL) aims to adapt to emerging new classes without forgetting
old ones. Traditional CIL models are trained from scratch to continually acquire knowledge …
Class-incremental learning: A survey
Deep models, e.g., CNNs and Vision Transformers, have achieved impressive results
in many vision tasks in the closed world. However, novel classes emerge from time to time in …
Expandable subspace ensemble for pre-trained model-based class-incremental learning
Abstract Class-Incremental Learning (CIL) requires a learning system to continually learn
new classes without forgetting. Despite the strong performance of Pre-Trained Models …
Trends and challenges of real-time learning in large language models: A critical review
M. Jovanovic, P. Voss - arXiv preprint arXiv:2404.18311, 2024 - arxiv.org
Real-time learning concerns the ability of learning systems to acquire knowledge over time,
enabling their adaptation and generalization to novel tasks. It is a critical ability for …
Towards General Industrial Intelligence: A Survey on IIoT-Enhanced Continual Large Models
Currently, most applications in the Industrial Internet of Things (IIoT) still rely on CNN-based
neural networks. Although Transformer-based large models (LMs), including language …
Exemplar-free continual representation learning via learnable drift compensation
Exemplar-free class-incremental learning using a backbone trained from scratch and
starting from a small first task presents a significant challenge for continual representation …
A Practitioner's Guide to Continual Multimodal Pretraining
Multimodal foundation models serve numerous applications at the intersection of vision and
language. Still, despite being pretrained on extensive data, they become outdated over time …
Calibrating Higher-Order Statistics for Few-Shot Class-Incremental Learning with Pre-trained Vision Transformers
D. Goswami, B. Twardowski… - Proceedings of the …, 2024 - openaccess.thecvf.com
Few-shot class-incremental learning (FSCIL) aims to adapt the model to new classes from
very few data (5 samples) without forgetting the previously learned classes. Recent works in …
Sparse Orthogonal Parameters Tuning for Continual Learning
Continual learning methods based on pre-trained models (PTMs), which adapt to
successive downstream tasks without catastrophic forgetting, have recently gained attention. These …
Premonition: Using generative models to preempt future data changes in continual learning
Continual learning requires a model to adapt to ongoing changes in the data distribution,
and often to the set of tasks to be performed. It is rare, however, that the data and task …