Dataset distillation: A comprehensive review
Recent success of deep learning is largely attributed to the sheer amount of data used for
training deep neural networks. Despite the unprecedented success, the massive data …
A comprehensive survey of dataset distillation
Deep learning technology has developed unprecedentedly in the last decade and has
become the primary choice in many application domains. This progress is mainly attributed …
Slimmable dataset condensation
Dataset distillation, also known as dataset condensation, aims to compress a large dataset
into a compact synthetic one. Existing methods perform dataset condensation by assuming a …
Dataset quantization
State-of-the-art deep neural networks are trained with large amounts (millions or even
billions) of data. The expensive computation and memory costs make it difficult to train them …
Dream: Efficient dataset distillation by representative matching
Dataset distillation aims to synthesize small datasets with little information loss from original
large-scale ones for reducing storage and training costs. Recent state-of-the-art methods …
On the diversity and realism of distilled dataset: An efficient dataset distillation paradigm
Contemporary machine learning which involves training large neural networks on massive
datasets faces significant computational challenges. Dataset distillation as a recent …
You only condense once: Two rules for pruning condensed datasets
Dataset condensation is a crucial tool for enhancing training efficiency by reducing the size
of the training dataset, particularly in on-device scenarios. However, these scenarios have …
Towards lossless dataset distillation via difficulty-aligned trajectory matching
The ultimate goal of Dataset Distillation is to synthesize a small synthetic dataset such that a
model trained on this synthetic set will perform equally well as a model trained on the full …
Sequential subset matching for dataset distillation
Dataset distillation is a newly emerging task that synthesizes a small-size dataset used in
training deep neural networks (DNNs) for reducing data storage and model training costs …
Generalized large-scale data condensation via various backbone and statistical matching
The lightweight "local-match-global" matching introduced by SRe2L successfully creates a
distilled dataset with comprehensive information on the full 224x224 ImageNet-1k. However …