Dataset distillation: A comprehensive review

R Yu, S Liu, X Wang - IEEE transactions on pattern analysis …, 2023 - ieeexplore.ieee.org
Recent success of deep learning is largely attributed to the sheer amount of data used for
training deep neural networks. Despite the unprecedented success, the massive data …

A comprehensive survey of dataset distillation

S Lei, D Tao - IEEE Transactions on Pattern Analysis and …, 2023 - ieeexplore.ieee.org
Deep learning technology has developed unprecedentedly in the last decade and has
become the primary choice in many application domains. This progress is mainly attributed …

Dream: Efficient dataset distillation by representative matching

Y Liu, J Gu, K Wang, Z Zhu… - Proceedings of the …, 2023 - openaccess.thecvf.com
Dataset distillation aims to synthesize small datasets with little information loss from original
large-scale ones for reducing storage and training costs. Recent state-of-the-art methods …

On the diversity and realism of distilled dataset: An efficient dataset distillation paradigm

P Sun, B Shi, D Yu, T Lin - … of the IEEE/CVF Conference on …, 2024 - openaccess.thecvf.com
Contemporary machine learning which involves training large neural networks on massive
datasets faces significant computational challenges. Dataset distillation as a recent …

Squeeze, recover and relabel: Dataset condensation at imagenet scale from a new perspective

Z Yin, E Xing, Z Shen - Advances in Neural Information …, 2023 - proceedings.neurips.cc
We present a new dataset condensation framework termed Squeeze, Recover and Relabel
(SRe$^2$L) that decouples the bilevel optimization of model and synthetic data during …

Slimmable dataset condensation

S Liu, J Ye, R Yu, X Wang - … of the IEEE/CVF Conference on …, 2023 - openaccess.thecvf.com
Dataset distillation, also known as dataset condensation, aims to compress a large dataset
into a compact synthetic one. Existing methods perform dataset condensation by assuming a …

Data distillation: A survey

N Sachdeva, J McAuley - arXiv preprint arXiv:2301.04272, 2023 - arxiv.org
The popularity of deep learning has led to the curation of a vast number of massive and
multifarious datasets. Despite having close-to-human performance on individual tasks …

Towards lossless dataset distillation via difficulty-aligned trajectory matching

Z Guo, K Wang, G Cazenavette, H Li, K Zhang… - arXiv preprint arXiv …, 2023 - arxiv.org
The ultimate goal of Dataset Distillation is to synthesize a small synthetic dataset such that a
model trained on this synthetic set will perform equally well as a model trained on the full …

Sequential subset matching for dataset distillation

J Du, Q Shi, JT Zhou - Advances in Neural Information …, 2023 - proceedings.neurips.cc
Dataset distillation is a newly emerging task that synthesizes a small-size dataset used in
training deep neural networks (DNNs) for reducing data storage and model training costs …

Efficient dataset distillation via minimax diffusion

J Gu, S Vahidian, V Kungurtsev… - Proceedings of the …, 2024 - openaccess.thecvf.com
Dataset distillation reduces the storage and computational consumption of training a
network by generating a small surrogate dataset that encapsulates rich information of the …