A comprehensive study of class incremental learning algorithms for visual tasks

E Belouadah, A Popescu, I Kanellos - Neural Networks, 2021 - Elsevier
The ability of artificial agents to increment their capabilities when confronted with new data is
an open challenge in artificial intelligence. The main challenge faced in such cases is …

Adding conditional control to text-to-image diffusion models

L Zhang, A Rao, M Agrawala - Proceedings of the IEEE/CVF …, 2023 - openaccess.thecvf.com
We present ControlNet, a neural network architecture to add spatial conditioning controls to
large, pretrained text-to-image diffusion models. ControlNet locks the production-ready large …

A survey on few-shot class-incremental learning

S Tian, L Li, W Li, H Ran, X Ning, P Tiwari - Neural Networks, 2024 - Elsevier
Large deep learning models are impressive, but they struggle when real-time data is not
available. Few-shot class-incremental learning (FSCIL) poses a significant challenge for …

Vision transformer adapter for dense predictions

Z Chen, Y Duan, W Wang, J He, T Lu, J Dai… - arXiv preprint arXiv …, 2022 - arxiv.org
This work investigates a simple yet powerful adapter for Vision Transformer (ViT). Unlike
recent visual transformers that introduce vision-specific inductive biases into their …

Parameter-efficient transfer learning for NLP

N Houlsby, A Giurgiu, S Jastrzebski… - International …, 2019 - proceedings.mlr.press
Fine-tuning large pretrained models is an effective transfer mechanism in NLP. However, in
the presence of many downstream tasks, fine-tuning is parameter inefficient: an entire new …

A continual learning survey: Defying forgetting in classification tasks

M De Lange, R Aljundi, M Masana… - IEEE Transactions on …, 2021 - ieeexplore.ieee.org
Artificial neural networks thrive in solving the classification problem for a particular rigid task,
acquiring knowledge through generalized learning behaviour from a distinct training phase …

Class-incremental learning: survey and performance evaluation on image classification

M Masana, X Liu, B Twardowski… - … on Pattern Analysis …, 2022 - ieeexplore.ieee.org
For future learning systems, incremental learning is desirable because it allows for: efficient
resource usage by eliminating the need to retrain from scratch at the arrival of new data; …

End-to-end multi-task learning with attention

S Liu, E Johns, AJ Davison - … of the IEEE/CVF conference on …, 2019 - openaccess.thecvf.com
We propose a novel multi-task learning architecture, which allows learning of task-specific
feature-level attention. Our design, the Multi-Task Attention Network (MTAN), consists of a …

Learning multiple visual domains with residual adapters

SA Rebuffi, H Bilen, A Vedaldi - Advances in neural …, 2017 - proceedings.neurips.cc
There is a growing interest in learning data representations that work well for many different
types of problems and data. In this paper, we look in particular at the task of learning a single …

Piggyback: Adapting a single network to multiple tasks by learning to mask weights

A Mallya, D Davis, S Lazebnik - Proceedings of the …, 2018 - openaccess.thecvf.com
This work presents a method for adapting a single, fixed deep neural network to multiple
tasks without affecting performance on already learned tasks. By building upon ideas from …