Dataset distillation: A comprehensive review
Recent success of deep learning is largely attributed to the sheer amount of data used for
training deep neural networks. Despite the unprecedented success, the massive data …
A comprehensive survey of dataset distillation
Deep learning technology has advanced at an unprecedented pace over the last decade and has
become the primary choice in many application domains. This progress is mainly attributed …
Dream: Efficient dataset distillation by representative matching
Dataset distillation aims to synthesize small datasets with little information loss from original
large-scale ones for reducing storage and training costs. Recent state-of-the-art methods …
On the diversity and realism of distilled dataset: An efficient dataset distillation paradigm
Contemporary machine learning, which involves training large neural networks on massive
datasets, faces significant computational challenges. Dataset distillation as a recent …
Squeeze, recover and relabel: Dataset condensation at imagenet scale from a new perspective
Z Yin, E Xing, Z Shen - Advances in Neural Information …, 2023 - proceedings.neurips.cc
We present a new dataset condensation framework termed Squeeze, Recover and Relabel
(SRe$^2$L) that decouples the bilevel optimization of model and synthetic data during …
Slimmable dataset condensation
Dataset distillation, also known as dataset condensation, aims to compress a large dataset
into a compact synthetic one. Existing methods perform dataset condensation by assuming a …
Data distillation: A survey
The popularity of deep learning has led to the curation of a vast number of massive and
multifarious datasets. Despite having close-to-human performance on individual tasks …
Towards lossless dataset distillation via difficulty-aligned trajectory matching
The ultimate goal of dataset distillation is to synthesize a small synthetic dataset such that a
model trained on this synthetic set will perform as well as a model trained on the full …
Sequential subset matching for dataset distillation
J Du, Q Shi, JT Zhou - Advances in Neural Information …, 2023 - proceedings.neurips.cc
Dataset distillation is a newly emerging task that synthesizes a small-size dataset used in
training deep neural networks (DNNs) for reducing data storage and model training costs …
Efficient dataset distillation via minimax diffusion
Dataset distillation reduces the storage and computational consumption of training a
network by generating a small surrogate dataset that encapsulates rich information of the …