Decision-focused learning: Foundations, state of the art, benchmark and future opportunities
Decision-focused learning (DFL) is an emerging paradigm that integrates machine learning
(ML) and constrained optimization to enhance decision quality by training ML models in an …
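As a rough illustration of the idea behind DFL (written here in standard predict-then-optimize notation, not quoted from the survey), a model m_θ maps features x to predicted costs ĉ = m_θ(x), the downstream decision is the optimizer z*(ĉ), and training targets the decision regret rather than the prediction error:

    z^*(c) = \arg\min_{z \in Z} c^\top z,
    \qquad
    \mathrm{Regret}(\hat{c}, c) = c^\top z^*(\hat{c}) - c^\top z^*(c) \;\ge\; 0 .

Because z^*(\cdot) is piecewise constant in \hat{c}, practical DFL methods smooth or replace this argmin to obtain usable gradients.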
Distributionally robust optimization: A review on theory and applications
In this paper, we survey the primary research on the theory and applications of
distributionally robust optimization (DRO). We start with reviewing the modeling power and …
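In the standard notation used for this line of work (a sketch, not quoted from the review), DRO hedges against a worst-case distribution P drawn from an ambiguity set \mathcal{U} around the nominal or empirical distribution:

    \min_{x \in X} \; \sup_{P \in \mathcal{U}} \; \mathbb{E}_{\xi \sim P}\big[ f(x, \xi) \big] .

Typical choices of \mathcal{U} include moment-based sets and balls in a φ-divergence or Wasserstein distance around the empirical distribution.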
Towards out-of-distribution generalization: A survey
Traditional machine learning paradigms are based on the assumption that both training and
test data follow the same statistical pattern, which is mathematically referred to as …
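The "same statistical pattern" the snippet refers to is the usual i.i.d. assumption; out-of-distribution generalization studies what happens when it fails (standard notation, not quoted from the survey):

    P_{\mathrm{train}}(X, Y) = P_{\mathrm{test}}(X, Y)
    \quad \text{(i.i.d. setting)}
    \qquad \text{vs.} \qquad
    P_{\mathrm{train}}(X, Y) \neq P_{\mathrm{test}}(X, Y)
    \quad \text{(distribution shift)} .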
Discover and cure: Concept-aware mitigation of spurious correlation
Deep neural networks often rely on spurious correlations to make predictions, which hinders
generalization beyond training environments. For instance, models that associate cats with …
On the need for a language describing distribution shifts: Illustrations on tabular datasets
Different distribution shifts require different algorithmic and operational interventions.
Methodological research must be grounded by the specific shifts they address. Although …
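A common way to make "different distribution shifts" precise, and roughly the distinction this line of work draws for tabular data (sketched here in standard notation, not quoted from the paper), is to factor the joint distribution and ask which factor moves:

    P(X, Y) = P(Y \mid X)\, P(X),
    \qquad
    X\text{-shift: } P_{\mathrm{train}}(X) \neq P_{\mathrm{test}}(X),
    \qquad
    Y \mid X\text{-shift: } P_{\mathrm{train}}(Y \mid X) \neq P_{\mathrm{test}}(Y \mid X) .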
No subclass left behind: Fine-grained robustness in coarse-grained classification problems
In real-world classification tasks, each class often comprises multiple finer-grained
"subclasses." As the subclass labels are frequently unavailable, models trained using only …
DoG is SGD's best friend: A parameter-free dynamic step size schedule
We propose a tuning-free dynamic SGD step size formula, which we call Distance over
Gradients (DoG). The DoG step sizes depend on simple empirical quantities (distance from …
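The "distance over gradients" quantity the snippet truncates can be sketched as follows. This is a minimal NumPy illustration of a DoG-style rule, eta_t = (max distance from the initial point) / sqrt(sum of squared gradient norms); the function name dog_step and the r_eps floor on the initial distance are this sketch's own choices, not the paper's reference implementation.

    import numpy as np

    def dog_step(x0, grad_fn, steps=100, r_eps=1e-4):
        # DoG-style schedule: eta_t = max_{i<=t} ||x_i - x0|| / sqrt(sum_{i<=t} ||g_i||^2),
        # with the distance floored at r_eps so the very first step is nonzero.
        x = np.asarray(x0, dtype=float).copy()
        max_dist = r_eps        # largest distance from the starting point seen so far
        grad_sq_sum = 0.0       # running sum of squared gradient norms
        for _ in range(steps):
            g = grad_fn(x)
            grad_sq_sum += float(np.dot(g, g))
            eta = max_dist / np.sqrt(grad_sq_sum + 1e-12)
            x = x - eta * g
            max_dist = max(max_dist, float(np.linalg.norm(x - x0)))
        return x

    # toy usage: minimize f(x) = 0.5 * ||x||^2, whose gradient is x
    x_opt = dog_step(np.array([5.0, -3.0]), lambda x: x)

Note how no learning rate is supplied: the step size is driven entirely by the observed iterate distances and gradient norms, which is the "tuning-free" property the abstract highlights.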
Correct-n-contrast: A contrastive approach for improving robustness to spurious correlations
Spurious correlations pose a major challenge for robust machine learning. Models trained
with empirical risk minimization (ERM) may learn to rely on correlations between class …
Spurious correlations in machine learning: A survey
Machine learning systems are known to be sensitive to spurious correlations between non-
essential features of the inputs (e.g., background, texture, and secondary objects) and the …
Preserving fairness generalization in deepfake detection
Although effective deepfake detection models have been developed in recent years, recent
studies have revealed that these models can result in unfair performance disparities among …