DD-RobustBench: An Adversarial Robustness Benchmark for Dataset Distillation

Y Wu, J Du, P Liu, Y Lin, W Xu, W Cheng - arXiv preprint arXiv:2403.13322, 2024 - arxiv.org
Dataset distillation is an advanced technique aimed at compressing datasets into
significantly smaller counterparts while preserving formidable training performance …

Toward Mitigating Architecture Overfitting on Distilled Datasets

X Zhong, C Liu - IEEE Transactions on Neural Networks and …, 2025 - ieeexplore.ieee.org
Dataset distillation (DD) methods have demonstrated remarkable performance for neural
networks trained with very limited training data. However, a significant challenge arises in …