Llama 2: Open foundation and fine-tuned chat models

H Touvron, L Martin, K Stone, P Albert… - arXiv preprint arXiv …, 2023 - arxiv.org
In this work, we develop and release Llama 2, a collection of pretrained and fine-tuned large
language models (LLMs) ranging in scale from 7 billion to 70 billion parameters. Our fine …

Learn from others and be yourself in heterogeneous federated learning

W Huang, M Ye, B Du - … of the IEEE/CVF Conference on …, 2022 - openaccess.thecvf.com
Federated learning has emerged as an important distributed learning paradigm, which
normally involves collaborative updating with others and local updating on private data …

A continual learning survey: Defying forgetting in classification tasks

M De Lange, R Aljundi, M Masana… - IEEE Transactions on …, 2021 - ieeexplore.ieee.org
Artificial neural networks thrive in solving the classification problem for a particular rigid task,
acquiring knowledge through generalized learning behaviour from a distinct training phase …

Coda-prompt: Continual decomposed attention-based prompting for rehearsal-free continual learning

JS Smith, L Karlinsky, V Gutta… - Proceedings of the …, 2023 - openaccess.thecvf.com
Computer vision models suffer from a phenomenon known as catastrophic forgetting when
learning novel concepts from continuously shifting training data. Typical solutions for this …

Orthogonal gradient descent for continual learning

M Farajtabar, N Azizan, A Mott… - … Conference on Artificial …, 2020 - proceedings.mlr.press
Neural networks are achieving state-of-the-art and sometimes super-human performance on
learning tasks across a variety of domains. Whenever these problems require learning in a …
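
The snippet is only the abstract's opening, but the title names the method: Orthogonal Gradient Descent projects each new task's gradient onto the subspace orthogonal to gradient directions stored from earlier tasks (in the paper, gradients of the model's predictions on earlier-task examples), so updates avoid moving along them. A minimal sketch of that projection step, assuming a stored orthonormal basis; all names here are illustrative, not from any released code:

```python
import numpy as np

def project_orthogonal(grad: np.ndarray, basis: list) -> np.ndarray:
    """Remove from `grad` its components along previously stored directions,
    so the update does not move along them (the OGD projection idea)."""
    for v in basis:  # each v is unit-norm and mutually orthogonal
        grad = grad - np.dot(grad, v) * v
    return grad

def extend_basis(basis: list, new_grad: np.ndarray) -> None:
    """Gram-Schmidt step: orthonormalize a new gradient direction against
    the basis and store it for use when training on later tasks."""
    residual = project_orthogonal(new_grad.copy(), basis)
    norm = np.linalg.norm(residual)
    if norm > 1e-8:  # skip directions already spanned by the basis
        basis.append(residual / norm)
```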

A critical review on the state-of-the-art and future prospects of Machine Learning for Earth Observation Operations

P Miralles, K Thangavel, AF Scannapieco… - Advances in Space …, 2023 - Elsevier
The continuing Machine Learning (ML) revolution indubitably has had a significant
positive impact on the analysis of downlinked satellite data. Other aspects of the Earth …

Probing representation forgetting in supervised and unsupervised continual learning

MR Davari, N Asadi, S Mudur… - Proceedings of the …, 2022 - openaccess.thecvf.com
Continual Learning (CL) research typically focuses on tackling the phenomenon of
catastrophic forgetting in neural networks. Catastrophic forgetting is associated with an …

Leep: A new measure to evaluate transferability of learned representations

C Nguyen, T Hassner, M Seeger… - International …, 2020 - proceedings.mlr.press
We introduce a new measure to evaluate the transferability of representations learned by
classifiers. Our measure, the Log Expected Empirical Prediction (LEEP), is simple and easy …
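
The snippet introduces LEEP but is cut off before the definition. From the paper's stated definition, LEEP is the average log-likelihood of each target label under an "expected empirical prediction" formed from the pretrained source model's softmax outputs on the target data. A minimal NumPy sketch of that computation; function and variable names are illustrative, not from any released code:

```python
import numpy as np

def leep_score(source_probs: np.ndarray, target_labels: np.ndarray) -> float:
    """Log Expected Empirical Prediction (LEEP).

    source_probs : (n, |Z|) softmax outputs of the pretrained source model
                   ("dummy" distributions over source labels z) on the
                   n target examples; assumed strictly positive.
    target_labels: (n,) integer target labels y in {0, ..., |Y|-1}.
    """
    n, num_z = source_probs.shape
    num_y = int(target_labels.max()) + 1

    # Empirical joint distribution P(y, z) over the target data.
    joint = np.zeros((num_y, num_z))
    for probs, y in zip(source_probs, target_labels):
        joint[y] += probs
    joint /= n

    # Marginal P(z) and conditional P(y | z).
    marginal_z = joint.sum(axis=0, keepdims=True)   # (1, |Z|)
    conditional = joint / marginal_z                # (|Y|, |Z|)

    # Expected empirical predictor: sum_z P(y | z) * theta(x)_z.
    expected_pred = source_probs @ conditional.T    # (n, |Y|)

    # LEEP = average log-likelihood of the true target labels.
    return float(np.mean(np.log(expected_pred[np.arange(n), target_labels])))
```

Higher LEEP indicates that the source model's outputs are more predictive of the target labels, i.e., better expected transferability.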

Understanding the role of training regimes in continual learning

SI Mirzadeh, M Farajtabar, R Pascanu… - Advances in …, 2020 - proceedings.neurips.cc
Catastrophic forgetting affects the training of neural networks, limiting their ability to learn
multiple tasks sequentially. From the perspective of the well established plasticity-stability …

When do curricula work?

X Wu, E Dyer, B Neyshabur - arXiv preprint arXiv:2012.03107, 2020 - arxiv.org
Inspired by human learning, researchers have proposed ordering examples during training
based on their difficulty. Both curriculum learning, exposing a network to easier examples …
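
The snippet describes the core mechanism: curriculum learning presents examples ordered from easy to hard, versus anti-curriculum (hard to easy) and the usual random-order baseline. A minimal sketch of difficulty-based ordering, assuming a user-supplied per-example scoring function; `difficulty` is a hypothetical callable, not from the paper's code:

```python
def curriculum_order(examples, difficulty, anti=False):
    """Sort training examples easiest-first (hardest-first if anti=True);
    shuffling instead of sorting recovers the standard random baseline."""
    return sorted(examples, key=difficulty, reverse=anti)
```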