Survey: Image mixing and deleting for data augmentation

H Naveed, S Anwar, M Hayat, K Javed… - Engineering Applications of …, 2024 - Elsevier
Neural networks are prone to overfitting and memorizing data patterns. To avoid overfitting
and enhance their generalization and performance, various methods have been suggested …

A survey of mix-based data augmentation: Taxonomy, methods, applications, and explainability

C Cao, F Zhou, Y Dai, J Wang, K Zhang - ACM Computing Surveys, 2024 - dl.acm.org
Data augmentation (DA) is indispensable in modern machine learning and deep neural
networks. The basic idea of DA is to construct new training data to improve the model's …

Robust heterogeneous federated learning under data corruption

X Fang, M Ye, X Yang - Proceedings of the IEEE/CVF …, 2023 - openaccess.thecvf.com
Model heterogeneous federated learning is a realistic and challenging problem.
However, due to the limitations of data collection, storage, and transmission conditions, as …

Omg: Towards effective graph classification against label noise

N Yin, L Shen, M Wang, X Luo, Z Luo… - IEEE Transactions on …, 2023 - ieeexplore.ieee.org
Graph classification is a fundamental problem with diverse applications in bioinformatics
and chemistry. Due to the intricate procedures of manual annotations in graphical domains …

Dart: Diversify-aggregate-repeat training improves generalization of neural networks

S Jain, S Addepalli, PK Sahu… - Proceedings of the …, 2023 - openaccess.thecvf.com
Generalization of Neural Networks is crucial for deploying them safely in the real
world. Common training strategies to improve generalization involve the use of data …

Fedfa: Federated feature augmentation

T Zhou, E Konukoglu - arXiv preprint arXiv:2301.12995, 2023 - arxiv.org
Federated learning is a distributed paradigm that allows multiple parties to collaboratively
train deep models without exchanging the raw data. However, the data distribution among …

Your Transferability Barrier is Fragile: Free-Lunch for Transferring the Non-Transferable Learning

Z Hong, L Shen, T Liu - … of the IEEE/CVF Conference on …, 2024 - openaccess.thecvf.com
Recently, non-transferable learning (NTL) was proposed to restrict models' generalization
toward the target domain(s), serving as a state-of-the-art solution for intellectual …

NoisyMix: Boosting robustness by combining data augmentations, stability training, and noise injections

NB Erichson, SH Lim, F Utrera, W Xu… - arXiv preprint arXiv …, 2022 - researchgate.net
For many real-world applications, obtaining stable and robust statistical performance is more
important than simply achieving state-of-the-art predictive test accuracy, and thus robustness …

A unified deep semantic expansion framework for domain-generalized person re-identification

EPW Ang, S Lin, AC Kot - Neurocomputing, 2024 - Elsevier
Abstract Supervised Person Re-identification (Person ReID) methods have achieved
excellent performance when training and testing within one camera network. However, they …

NoisyMix: Boosting model robustness to common corruptions

B Erichson, SH Lim, W Xu, F Utrera… - International …, 2024 - proceedings.mlr.press
The robustness of neural networks has become increasingly important in real-world
applications where stable and reliable performance is valued over simply achieving high …