Last layer re-training is sufficient for robustness to spurious correlations
Neural network classifiers can largely rely on simple spurious features, such as
backgrounds, to make predictions. However, even in these cases, we show that they still …
Improving out-of-distribution robustness via selective augmentation
Machine learning algorithms typically assume that training and test examples are
drawn from the same distribution. However, distribution shift is a common problem in real …
Discover and cure: Concept-aware mitigation of spurious correlation
Deep neural networks often rely on spurious correlations to make predictions, which hinders
generalization beyond training environments. For instance, models that associate cats with …
Wild-time: A benchmark of in-the-wild distribution shift over time
Distribution shifts occur when the test distribution differs from the training distribution, and
can considerably degrade performance of machine learning models deployed in the real …
Predictive overfitting in immunological applications: Pitfalls and solutions
Overfitting describes the phenomenon where a highly predictive model on the training data
generalizes poorly to future observations. It is a common concern when applying machine …
DrugOOD: Out-of-Distribution (OOD) Dataset Curator and Benchmark for AI-aided Drug Discovery--A Focus on Affinity Prediction Problems with Noise Annotations
AI-aided drug discovery (AIDD) is gaining increasing popularity due to its promise of making
the search for new pharmaceuticals quicker, cheaper and more efficient. In spite of its …
Domain adaptation under open set label shift
We introduce the problem of domain adaptation under Open Set Label Shift (OSLS), where
the label distribution can change arbitrarily and a new class may arrive during deployment …
Brave the wind and the waves: Discovering robust and generalizable graph lottery tickets
The training and inference of Graph Neural Networks (GNNs) are costly when scaling up to
large-scale graphs. Graph Lottery Ticket (GLT) has presented the first attempt to accelerate …
Benchmarking distribution shift in tabular data with tableshift
Robustness to distribution shift has become a growing concern for text and image models as
they transition from research subjects to deployment in the real world. However, high-quality …
Robust learning with progressive data expansion against spurious correlation
While deep learning models have shown remarkable performance in various tasks, they are
susceptible to learning non-generalizable _spurious features_ rather than the core features …