Last layer re-training is sufficient for robustness to spurious correlations
Neural network classifiers can largely rely on simple spurious features, such as
backgrounds, to make predictions. However, even in these cases, we show that they still …
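As a rough illustration of the last-layer re-training idea named in the title above: freeze the ERM-trained feature extractor and refit only the final linear classifier on a small held-out set (typically group-balanced). This is a minimal sketch under those assumptions, not the paper's exact recipe; the names `backbone`, `heldout_loader`, and `feat_dim`, and the group-balancing of the held-out loader, are illustrative.

```python
# Minimal sketch of last-layer re-training (assumed setup, not the paper's exact recipe).
import torch
import torch.nn as nn

def retrain_last_layer(backbone, heldout_loader, num_classes, feat_dim,
                       epochs=20, lr=1e-3, device="cpu"):
    backbone.eval()                                 # freeze the ERM-trained feature extractor
    head = nn.Linear(feat_dim, num_classes).to(device)
    opt = torch.optim.SGD(head.parameters(), lr=lr, momentum=0.9)
    loss_fn = nn.CrossEntropyLoss()
    for _ in range(epochs):
        for x, y in heldout_loader:                 # loader assumed to be group-balanced
            x, y = x.to(device), y.to(device)
            with torch.no_grad():
                feats = backbone(x)                 # reuse the learned features unchanged
            opt.zero_grad()
            loss = loss_fn(head(feats), y)          # refit only the linear head
            loss.backward()
            opt.step()
    return head
```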
Improving out-of-distribution robustness via selective augmentation
Machine learning algorithms typically assume that training and test examples are
drawn from the same distribution. However, distribution shift is a common problem in real …
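One common form of selective augmentation is to mix pairs of examples that share a label but come from different domains, so the interpolation tends to cancel domain-specific cues while preserving the label. The sketch below shows only that intra-label mixup step; the pairing of `x_a`/`x_b` across domains and the Beta(alpha, alpha) mixing coefficient are assumptions, not the paper's full procedure.

```python
# Intra-label mixup across domains (illustrative assumption, not the full method).
import torch

def intra_label_mixup(x_a: torch.Tensor, x_b: torch.Tensor, alpha: float = 2.0) -> torch.Tensor:
    """Mix same-label inputs drawn from different domains."""
    lam = torch.distributions.Beta(alpha, alpha).sample()  # mixing coefficient in (0, 1)
    return lam * x_a + (1.0 - lam) * x_b                    # labels are shared, so they need no mixing
```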
On feature learning in the presence of spurious correlations
Deep classifiers are known to rely on spurious features—patterns which are correlated with
the target on the training data but not inherently relevant to the learning problem, such as the …
Learning causally invariant representations for out-of-distribution generalization on graphs
Despite recent success in using the invariance principle for out-of-distribution (OOD)
generalization on Euclidean data (e.g., images), studies on graph data are still limited …
Discover and cure: Concept-aware mitigation of spurious correlation
Deep neural networks often rely on spurious correlations to make predictions, which hinders
generalization beyond training environments. For instance, models that associate cats with …
Towards last-layer retraining for group robustness with fewer annotations
T LaBonte, V Muthukumar… - Advances in Neural …, 2023 - proceedings.neurips.cc
Empirical risk minimization (ERM) of neural networks is prone to over-reliance on spurious
correlations and poor generalization on minority groups. The recent deep feature …
Spurious correlations in machine learning: A survey
Machine learning systems are known to be sensitive to spurious correlations between non-
essential features of the inputs (e.g., background, texture, and secondary objects) and the …
Simple and fast group robustness by automatic feature reweighting
A major challenge to out-of-distribution generalization is reliance on spurious features—
patterns that are predictive of the class label in the training data distribution, but not causally …
Out-of-distribution generalization on graphs: A survey
Graph machine learning has been extensively studied in both academia and industry.
Although booming with a vast number of emerging methods and techniques, most of the …
Mind the label shift of augmentation-based graph OOD generalization
Out-of-distribution (OOD) generalization is an important issue for Graph Neural
Networks (GNNs). Recent works employ different graph editions to generate augmented …