Dataset distillation: A comprehensive review

R Yu, S Liu, X Wang - IEEE Transactions on Pattern Analysis …, 2023 - ieeexplore.ieee.org
Recent success of deep learning is largely attributed to the sheer amount of data used for
training deep neural networks. Despite the unprecedented success, the massive data …

A comprehensive survey of dataset distillation

S Lei, D Tao - IEEE Transactions on Pattern Analysis and …, 2023 - ieeexplore.ieee.org
Deep learning technology has developed unprecedentedly in the last decade and has
become the primary choice in many application domains. This progress is mainly attributed …

Slimmable dataset condensation

S Liu, J Ye, R Yu, X Wang - … of the IEEE/CVF Conference on …, 2023 - openaccess.thecvf.com
Dataset distillation, also known as dataset condensation, aims to compress a large dataset
into a compact synthetic one. Existing methods perform dataset condensation by assuming a …

Data distillation: A survey

N Sachdeva, J McAuley - arXiv preprint arXiv:2301.04272, 2023 - arxiv.org
The popularity of deep learning has led to the curation of a vast number of massive and
multifarious datasets. Despite having close-to-human performance on individual tasks …

Accelerating dataset distillation via model augmentation

L Zhang, J Zhang, B Lei, S Mukherjee… - Proceedings of the …, 2023 - openaccess.thecvf.com
Dataset Distillation (DD), a newly emerging field, aims at generating much smaller but
efficient synthetic training datasets from large ones. Existing DD methods based on gradient …
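Several entries above (e.g. this one, which notes that existing DD methods are "based on gradient …") revolve around gradient-based dataset distillation. As a rough illustration of the gradient-matching idea only, here is a minimal NumPy sketch for a linear regression model: a small synthetic set is tuned so that the model gradient it induces matches the gradient induced by the full dataset. All names (`grad_mse`, `Xs`, `ys`, the learning rate, and the choice to tune only the synthetic labels) are illustrative assumptions, not the method of any paper listed here.

```python
import numpy as np

rng = np.random.default_rng(0)

# "Real" dataset: noisy linear data (sizes chosen for illustration).
d, n, m = 5, 200, 10                      # feature dim, real size, synthetic size
X = rng.normal(size=(n, d))
w_true = rng.normal(size=d)
y = X @ w_true + 0.1 * rng.normal(size=n)

# Tiny synthetic set: inputs fixed at random, labels learned by gradient matching.
Xs = rng.normal(size=(m, d))
ys = rng.normal(size=m)

def grad_mse(Xm, ym, w):
    """Gradient (w.r.t. w) of the mean squared error of a linear model."""
    return Xm.T @ (Xm @ w - ym) / len(ym)

def match_loss(ys_cur, ws):
    """Average squared gradient mismatch over a set of model parameters."""
    return np.mean([np.sum((grad_mse(Xs, ys_cur, w) - grad_mse(X, y, w)) ** 2)
                    for w in ws])

probe_ws = rng.normal(size=(20, d))       # fixed probe parameters for evaluation
loss_before = match_loss(ys, probe_ws)

lr = 0.5
for _ in range(300):
    w = rng.normal(size=d)                # sample a model parameter vector
    diff = grad_mse(Xs, ys, w) - grad_mse(X, y, w)
    # Analytic gradient of ||diff||^2 w.r.t. the synthetic labels ys:
    # d/dys ||g_syn - g_real||^2 = -(2/m) * Xs @ diff
    ys -= lr * (-(2.0 / m) * Xs @ diff)

loss_after = match_loss(ys, probe_ws)     # mismatch shrinks as ys is tuned
```

Real DD methods optimize the synthetic inputs as well (here only the labels are tuned, which keeps the update analytic) and match gradients along entire training trajectories of deep networks rather than at random parameter draws.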

Data-centric green artificial intelligence: A survey

S Salehi, A Schmeink - IEEE Transactions on Artificial …, 2023 - ieeexplore.ieee.org
With the exponential growth of computational power and the availability of large-scale
datasets in recent years, remarkable advancements have been made in the field of artificial …

V2XP-ASG: Generating adversarial scenes for vehicle-to-everything perception

H Xiang, R Xu, X Xia, Z Zheng… - 2023 IEEE International …, 2023 - ieeexplore.ieee.org
Recent advancements in Vehicle-to-Everything communication technology have enabled
autonomous vehicles to share sensory information to obtain better perception performance …

A survey of what to share in federated learning: Perspectives on model utility, privacy leakage, and communication efficiency

J Shao, Z Li, W Sun, T Zhou, Y Sun, L Liu, Z Lin… - arXiv preprint arXiv …, 2023 - arxiv.org
Federated learning (FL) has emerged as a secure paradigm for collaborative training among
clients. Without data centralization, FL allows clients to share local information in a privacy …

Backdoor attacks against dataset distillation

Y Liu, Z Li, M Backes, Y Shen, Y Zhang - arXiv preprint arXiv:2301.01197, 2023 - arxiv.org
Dataset distillation has emerged as a prominent technique to improve data efficiency when
training machine learning models. It encapsulates the knowledge from a large dataset into a …

Dataset distillation by automatic training trajectories

D Liu, J Gu, H Cao, C Trinitis, M Schulz - European Conference on …, 2024 - Springer
Dataset Distillation is used to create a concise, yet informative, synthetic dataset that can
replace the original dataset for training purposes. Some leading methods in this domain …