Open-environment machine learning
ZH Zhou - National Science Review, 2022 - academic.oup.com
Conventional machine learning studies generally assume close-environment scenarios
where important factors of the learning process hold invariant. With the great success of …
A comprehensive survey of forgetting in deep learning beyond continual learning
Forgetting refers to the loss or deterioration of previously acquired knowledge. While
existing surveys on forgetting have primarily focused on continual learning, forgetting is a …
Three types of incremental learning
Incrementally learning new information from a non-stationary stream of data, referred to as
'continual learning', is a key feature of natural intelligence, but a challenging problem for …
DualPrompt: Complementary prompting for rehearsal-free continual learning
Continual learning aims to enable a single model to learn a sequence of tasks without
catastrophic forgetting. Top-performing methods usually require a rehearsal buffer to store …
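As an aside on the rehearsal buffer mentioned in this entry (the mechanism DualPrompt is designed to avoid), a minimal sketch of experience replay with reservoir sampling is given below; the class name, fields, and capacity handling are illustrative assumptions, not the paper's implementation.

```python
import random

class RehearsalBuffer:
    """Minimal reservoir-sampling buffer for experience replay (illustrative sketch)."""

    def __init__(self, capacity: int):
        self.capacity = capacity
        self.samples = []   # stored (x, y) pairs from past tasks
        self.num_seen = 0   # total examples observed so far

    def add(self, x, y):
        """Reservoir sampling keeps an approximately uniform subsample of the stream."""
        self.num_seen += 1
        if len(self.samples) < self.capacity:
            self.samples.append((x, y))
        else:
            j = random.randrange(self.num_seen)
            if j < self.capacity:
                self.samples[j] = (x, y)

    def sample(self, batch_size: int):
        """Draw a replay mini-batch to mix with the current task's data."""
        k = min(batch_size, len(self.samples))
        return random.sample(self.samples, k)
```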
A continual learning survey: Defying forgetting in classification tasks
Artificial neural networks thrive in solving the classification problem for a particular rigid task,
acquiring knowledge through generalized learning behaviour from a distinct training phase …
Revisiting class-incremental learning with pre-trained models: Generalizability and adaptivity are all you need
Class-incremental learning (CIL) aims to adapt to emerging new classes without forgetting
old ones. Traditional CIL models are trained from scratch to continually acquire knowledge …
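The class-incremental setting described in this entry can be sketched roughly as follows: the label space is split into disjoint groups that arrive sequentially, and after each group the model is evaluated on all classes seen so far. The helper callables (`dataset_for`, `train_task`, `evaluate`) are assumptions for illustration, not any particular paper's API.

```python
from typing import Callable, List

def make_class_splits(class_ids: List[int], classes_per_task: int) -> List[List[int]]:
    """Partition the label space into disjoint, sequentially arriving class groups."""
    return [class_ids[i:i + classes_per_task]
            for i in range(0, len(class_ids), classes_per_task)]

def run_cil(model,
            dataset_for: Callable[[List[int]], object],    # assumed: builds a dataset restricted to given classes
            train_task: Callable[[object, object], None],  # assumed: fits the model on one task's data
            evaluate: Callable[[object, object], float],   # assumed: returns accuracy on a dataset
            splits: List[List[int]]) -> None:
    """Class-incremental protocol: train only on the newly arriving classes,
    then evaluate on the union of all classes seen so far."""
    seen: List[int] = []
    for task_classes in splits:
        train_task(model, dataset_for(task_classes))   # only the new classes are available for training
        seen = seen + task_classes
        acc = evaluate(model, dataset_for(seen))       # joint accuracy over all seen classes
        print(f"after {task_classes}: accuracy on {len(seen)} seen classes = {acc:.3f}")
```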
FOSTER: Feature boosting and compression for class-incremental learning
The ability to learn new concepts continually is necessary in this ever-changing world.
However, deep neural networks suffer from catastrophic forgetting when learning new …
Class-incremental learning: A survey
Deep models, e.g., CNNs and Vision Transformers, have achieved impressive results
in many vision tasks in the closed world. However, novel classes emerge from time to time in …
Computationally budgeted continual learning: What does matter?
Continual Learning (CL) aims to sequentially train models on streams of incoming data that
vary in distribution by preserving previous knowledge while adapting to new data. Current …
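The computationally budgeted setting in this last entry can be sketched as a stream loop that caps the number of gradient updates spent on each incoming chunk of data. The helper callables and the budget parameter below are assumptions for illustration, not the paper's protocol.

```python
from typing import Callable, Iterable

def budgeted_stream_training(model,
                             stream: Iterable,                            # chunks of a non-stationary data stream
                             sample_batch: Callable[[object], object],    # assumed: draws a mini-batch from a chunk
                             sgd_step: Callable[[object, object], None],  # assumed: performs one optimizer update
                             updates_per_chunk: int) -> None:
    """Computationally budgeted continual learning, sketched as a hard cap
    on the number of gradient updates allowed per incoming chunk."""
    for chunk in stream:                      # distribution may shift between chunks
        for _ in range(updates_per_chunk):    # fixed compute budget per chunk
            batch = sample_batch(chunk)
            sgd_step(model, batch)            # any replay or regularization must also fit within the budget
```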