Distributed artificial intelligence empowered by end-edge-cloud computing: A survey
As the computing paradigm shifts from cloud computing to end-edge-cloud computing, it
also supports artificial intelligence evolving from a centralized manner to a distributed one …
Shifting machine learning for healthcare from development to deployment and from models to data
In the past decade, the application of machine learning (ML) to healthcare has helped drive
the automation of physician tasks as well as enhancements in clinical capabilities and …
Dualprompt: Complementary prompting for rehearsal-free continual learning
Continual learning aims to enable a single model to learn a sequence of tasks without
catastrophic forgetting. Top-performing methods usually require a rehearsal buffer to store …
Dataset distillation via factorization
In this paper, we study dataset distillation (DD), from a novel perspective and introduce
a dataset factorization approach, termed HaBa, which is a plug-and-play …
Red teaming language models with language models
Language Models (LMs) often cannot be deployed because of their potential to harm users
in hard-to-predict ways. Prior work identifies harmful behaviors before deployment by using …
Learning to prompt for continual learning
The mainstream paradigm behind continual learning has been to adapt the model
parameters to non-stationary data distributions, where catastrophic forgetting is the central …
Dataset distillation: A comprehensive review
Recent success of deep learning is largely attributed to the sheer amount of data used for
training deep neural networks. Despite the unprecedented success, the massive data …
Slimmable dataset condensation
Dataset distillation, also known as dataset condensation, aims to compress a large dataset
into a compact synthetic one. Existing methods perform dataset condensation by assuming a …
Federated learning on non-IID data: A survey
Federated learning is an emerging distributed machine learning framework for privacy
preservation. However, models trained in federated learning usually have worse …
Swarm learning for decentralized and confidential clinical machine learning
S Warnat-Herresthal, H Schultze, KL Shastry… - Nature, 2021 - nature.com
Fast and reliable detection of patients with severe and heterogeneous illnesses is a major
goal of precision medicine. Patients with leukaemia can be identified using machine …