DD-RobustBench: An Adversarial Robustness Benchmark for Dataset Distillation
Dataset distillation is an advanced technique aimed at compressing datasets into
significantly smaller counterparts, while preserving formidable training performance …
Toward Mitigating Architecture Overfitting on Distilled Datasets
X Zhong, C Liu - IEEE Transactions on Neural Networks and …, 2025 - ieeexplore.ieee.org
Dataset distillation (DD) methods have demonstrated remarkable performance for neural
networks trained with very limited training data. However, a significant challenge arises in …