Llama 2: Open foundation and fine-tuned chat models
In this work, we develop and release Llama 2, a collection of pretrained and fine-tuned large
language models (LLMs) ranging in scale from 7 billion to 70 billion parameters. Our fine …
Learn from others and be yourself in heterogeneous federated learning
Federated learning has emerged as an important distributed learning paradigm, which
normally involves collaborative updating with others and local updating on private data …
A continual learning survey: Defying forgetting in classification tasks
Artificial neural networks thrive in solving the classification problem for a particular rigid task,
acquiring knowledge through generalized learning behaviour from a distinct training phase …
Coda-prompt: Continual decomposed attention-based prompting for rehearsal-free continual learning
Computer vision models suffer from a phenomenon known as catastrophic forgetting when
learning novel concepts from continuously shifting training data. Typical solutions for this …
Orthogonal gradient descent for continual learning
Neural networks are achieving state-of-the-art and sometimes super-human performance on
learning tasks across a variety of domains. Whenever these problems require learning in a …
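The snippet above refers to orthogonal gradient descent (OGD), which mitigates forgetting by projecting each new task's gradient onto the subspace orthogonal to gradient directions stored from earlier tasks. A minimal NumPy sketch of that projection idea follows; the function names and the toy vectors are illustrative, not the paper's implementation.

```python
import numpy as np

def ogd_update_basis(basis, new_grad, eps=1e-10):
    """Add new_grad's novel direction to an orthonormal basis of past-task gradients."""
    g = new_grad.copy()
    for v in basis:
        g -= (g @ v) * v          # strip components already spanned by the basis
    norm = np.linalg.norm(g)
    if norm > eps:                # keep only genuinely new directions
        basis.append(g / norm)
    return basis

def ogd_project(grad, basis):
    """Project grad onto the orthogonal complement of the stored directions."""
    g = grad.copy()
    for v in basis:
        g -= (g @ v) * v
    return g

# Toy usage: store one "task 1" gradient direction, then project a
# "task 2" gradient so the update cannot interfere with it.
basis = ogd_update_basis([], np.array([1.0, 0.0, 0.0]))
g2 = ogd_project(np.array([0.5, 1.0, 0.0]), basis)
```

After projection, `g2` has zero component along the stored direction, so a step along it leaves the first task's loss unchanged to first order.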
A critical review on the state-of-the-art and future prospects of Machine Learning for Earth Observation Operations
Abstract The continuing Machine Learning (ML) revolution indubitably has had a significant
positive impact on the analysis of downlinked satellite data. Other aspects of the Earth …
Probing representation forgetting in supervised and unsupervised continual learning
Continual Learning (CL) research typically focuses on tackling the phenomenon of
catastrophic forgetting in neural networks. Catastrophic forgetting is associated with an …
LEEP: A new measure to evaluate transferability of learned representations
We introduce a new measure to evaluate the transferability of representations learned by
classifiers. Our measure, the Log Expected Empirical Prediction (LEEP), is simple and easy …
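LEEP scores transferability from a source model's predicted label distributions on a target dataset: it forms the empirical joint of source (dummy) labels and target labels, derives the conditional P(y | z), and averages the log of the expected prediction of each true label. A small NumPy sketch under that definition (variable names are my own):

```python
import numpy as np

def leep(probs, labels):
    """LEEP transferability score.

    probs:  (n, Z) source-model predicted probabilities over Z source labels
            for n target examples (rows sum to 1).
    labels: (n,) integer target labels in [0, Y).
    """
    n, _ = probs.shape
    Y = labels.max() + 1
    # Empirical joint P(y, z): average source probabilities within each target class.
    joint = np.zeros((Y, probs.shape[1]))
    for y in range(Y):
        joint[y] = probs[labels == y].sum(axis=0) / n
    # Empirical conditional P(y | z) = P(y, z) / P(z).
    cond = joint / joint.sum(axis=0, keepdims=True)
    # Expected empirical prediction of the true label, averaged in log space.
    pred = probs @ cond.T                       # (n, Y)
    return float(np.mean(np.log(pred[np.arange(n), labels])))
```

Since the score is a mean log-probability, it is always non-positive; higher (closer to 0) indicates better expected transferability.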
Understanding the role of training regimes in continual learning
Catastrophic forgetting affects the training of neural networks, limiting their ability to learn
multiple tasks sequentially. From the perspective of the well-established plasticity-stability …
When do curricula work?
Inspired by human learning, researchers have proposed ordering examples during training
based on their difficulty. Both curriculum learning, exposing a network to easier examples …