Survey: Image mixing and deleting for data augmentation
Neural networks are prone to overfitting and memorizing data patterns. To avoid overfitting
and enhance their generalization and performance, various methods have been suggested …
A survey of mix-based data augmentation: Taxonomy, methods, applications, and explainability
Data augmentation (DA) is indispensable in modern machine learning and deep neural
networks. The basic idea of DA is to construct new training data to improve the model's …
Robust heterogeneous federated learning under data corruption
Abstract: Model-heterogeneous federated learning is a realistic and challenging problem.
However, due to the limitations of data collection, storage, and transmission conditions, as …
Omg: Towards effective graph classification against label noise
Graph classification is a fundamental problem with diverse applications in bioinformatics
and chemistry. Due to the intricate procedures of manual annotations in graphical domains …
Dart: Diversify-aggregate-repeat training improves generalization of neural networks
Abstract: Generalization of Neural Networks is crucial for deploying them safely in the real
world. Common training strategies to improve generalization involve the use of data …
Fedfa: Federated feature augmentation
Federated learning is a distributed paradigm that allows multiple parties to collaboratively
train deep models without exchanging the raw data. However, the data distribution among …
Your Transferability Barrier is Fragile: Free-Lunch for Transferring the Non-Transferable Learning
Recently, non-transferable learning (NTL) was proposed to restrict models' generalization
toward the target domain(s), serving as a state-of-the-art solution for intellectual …
Noisymix: Boosting robustness by combining data augmentations, stability training, and noise injections
For many real-world applications, obtaining stable and robust statistical performance is more
important than simply achieving state-of-the-art predictive test accuracy, and thus robustness …
A unified deep semantic expansion framework for domain-generalized person re-identification
Abstract: Supervised Person Re-identification (Person ReID) methods have achieved
excellent performance when training and testing within one camera network. However, they …
NoisyMix: Boosting model robustness to common corruptions
The robustness of neural networks has become increasingly important in real-world
applications where stable and reliable performance is valued over simply achieving high …