InfoBatch: Lossless training speed up by unbiased dynamic data pruning
Data pruning aims to obtain lossless performance with less overall cost. A common
approach is to filter out samples that make less contribution to the training. This could lead to …
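The snippet above describes pruning low-contribution samples while keeping training unbiased. Below is a minimal sketch of that idea, assuming a mean-loss threshold, a fixed prune probability, and 1/(1−r) loss rescaling; these are illustrative choices, not necessarily the paper's exact recipe.

```python
# A minimal sketch of unbiased dynamic data pruning in the spirit of the
# abstract above. Threshold rule, prune probability, and rescaling factor
# are assumptions for illustration.
import torch

def prune_and_rescale(losses, prune_prob=0.5):
    """Keep hard samples; randomly drop easy ones and up-weight survivors.

    losses: per-sample losses from the current step (1-D tensor).
    Returns (mask, weights): mask selects kept samples, weights rescale
    their losses so the expected gradient matches full-data training.
    """
    easy = losses < losses.mean()               # "low contribution" samples
    drop = easy & (torch.rand_like(losses) < prune_prob)
    mask = ~drop
    weights = torch.ones_like(losses)
    # Surviving easy samples stand in for the dropped ones: scaling their
    # loss by 1/(1 - prune_prob) keeps the gradient unbiased in expectation.
    weights[easy] = 1.0 / (1.0 - prune_prob)
    return mask, weights

# Hypothetical usage inside a training step (criterion with reduction='none'):
# losses = torch.nn.functional.cross_entropy(model(x), y, reduction='none')
# mask, w = prune_and_rescale(losses.detach())
# loss = (losses[mask] * w[mask]).mean()
# loss.backward()
```

In practice the prune decision would be made from cached losses before the forward pass, so compute is actually saved rather than only the backward pass.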
The stronger the diffusion model, the easier the backdoor: Data poisoning to induce copyright breaches without adjusting finetuning pipeline
The commercialization of text-to-image diffusion models (DMs) brings forth potential
copyright concerns. Despite numerous attempts to protect DMs from copyright issues, the …
Self-supervised dataset distillation: A good compression is all you need
Dataset distillation aims to compress information from a large-scale original dataset to a new
compact dataset while striving to preserve the utmost degree of the original data …
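The compression setup described above can be illustrated with a generic distribution-matching sketch: a small learnable synthetic set is optimized so its embeddings match those of the real data under a frozen encoder. The random encoder, mean-embedding loss, and dataset sizes below are assumptions for illustration; the paper's self-supervised objective differs.

```python
# A minimal dataset-distillation sketch via feature distribution matching.
import torch
import torch.nn as nn

torch.manual_seed(0)
real = torch.randn(1024, 3, 32, 32)                    # stand-in for the real dataset
syn = torch.randn(32, 3, 32, 32, requires_grad=True)   # learnable synthetic set

encoder = nn.Sequential(                               # frozen random feature extractor
    nn.Conv2d(3, 64, 3, stride=2, padding=1), nn.ReLU(),
    nn.AdaptiveAvgPool2d(1), nn.Flatten(),
)
for p in encoder.parameters():
    p.requires_grad_(False)

opt = torch.optim.Adam([syn], lr=0.01)
for step in range(100):
    idx = torch.randint(0, real.size(0), (256,))
    # Match mean embeddings of a real batch and the synthetic set.
    loss = (encoder(real[idx]).mean(0) - encoder(syn).mean(0)).pow(2).sum()
    opt.zero_grad()
    loss.backward()
    opt.step()
```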
DD-RobustBench: An adversarial robustness benchmark for dataset distillation
Dataset distillation is an advanced technique aimed at compressing datasets into
significantly smaller counterparts, while preserving formidable training performance …
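A benchmark like this evaluates models trained on distilled data under adversarial attacks. Below is a minimal sketch of one such evaluation, assuming a one-step FGSM attack with ε = 8/255; this is a common but illustrative choice, and the benchmark's actual attack suite is broader.

```python
# Accuracy of a trained model under a one-step FGSM attack on inputs in [0, 1].
import torch
import torch.nn.functional as F

def fgsm_accuracy(model, x, y, eps=8 / 255):
    x = x.clone().requires_grad_(True)
    loss = F.cross_entropy(model(x), y)
    loss.backward()
    # Perturb each pixel by eps in the direction that increases the loss.
    x_adv = (x + eps * x.grad.sign()).clamp(0, 1).detach()
    with torch.no_grad():
        pred = model(x_adv).argmax(dim=1)
    return (pred == y).float().mean().item()
```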
Unlocking the potential of federated learning: The symphony of dataset distillation via deep generative latents
Data heterogeneity presents significant challenges for federated learning (FL). Recently,
dataset distillation techniques have been introduced, and performed at the client level, to …
Generative dataset distillation based on diffusion model
This paper presents our method for the generative track of The First Dataset Distillation
Challenge at ECCV 2024. Since the diffusion model has become the mainstay of generative …
Group distributionally robust dataset distillation with risk minimization
Dataset distillation (DD) has emerged as a widely adopted technique for crafting a synthetic
dataset that captures the essential information of a training dataset, facilitating the training of …
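The distributionally robust idea in this abstract can be sketched with a standard group-DRO loss: per-group losses are combined under weights that drift toward the worst-performing group via exponentiated-gradient updates. The update rule and step size below are standard group-DRO choices, assumed for illustration rather than taken from the paper.

```python
# A minimal group-DRO loss sketch: up-weight the hardest groups.
import torch

def group_dro_loss(losses, group_ids, group_weights, eta=0.1):
    """losses: per-sample losses; group_ids: int group label per sample.

    Returns the robust loss and updated group weights (a distribution
    over groups that drifts toward the hardest group).
    """
    n_groups = group_weights.numel()
    group_loss = torch.zeros(n_groups)
    for g in range(n_groups):
        in_g = group_ids == g
        if in_g.any():
            group_loss[g] = losses[in_g].mean()
    # Exponentiated-gradient ascent on the group weights.
    with torch.no_grad():
        group_weights *= torch.exp(eta * group_loss)
        group_weights /= group_weights.sum()
    return (group_weights * group_loss).sum(), group_weights
```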
Prioritize Alignment in Dataset Distillation
Dataset Distillation aims to compress a large dataset into a significantly more compact,
synthetic one without compromising the performance of the trained models. To achieve this …
The Evolution of Dataset Distillation: Toward Scalable and Generalizable Solutions
Dataset distillation, which condenses large-scale datasets into compact synthetic
representations, has emerged as a critical solution for training modern deep learning …
Emphasizing discriminative features for dataset distillation in complex scenarios
Dataset distillation has demonstrated strong performance on simple datasets like CIFAR,
MNIST, and TinyImageNet but struggles to achieve similar results in more complex …