Tolerant algorithms for learning with arbitrary covariate shift

S Goel, A Shetty, K Stavropoulos… - Advances in Neural …, 2025 - proceedings.neurips.cc
We study the problem of learning under arbitrary distribution shift, where the learner is
trained on a labeled set from one distribution but evaluated on a different, potentially …
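For concreteness, the setting described above can be formalized as follows; this is a standard statement of learning under covariate shift rather than the paper's exact formulation, and the symbols $D_{\mathrm{train}}$, $D_{\mathrm{test}}$, and target $f$ are introduced here only for illustration:
\[
\text{given } (x_1, y_1), \dots, (x_n, y_n) \text{ with } x_i \sim D_{\mathrm{train}},\ y_i = f(x_i), \quad \text{output } h \text{ with small } \operatorname{err}_{D_{\mathrm{test}}}(h) = \Pr_{x \sim D_{\mathrm{test}}}[h(x) \neq f(x)],
\]
with no assumed relation between $D_{\mathrm{train}}$ and $D_{\mathrm{test}}$.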

Efficient discrepancy testing for learning with distribution shift

G Chandrasekaran, A Klivans… - Advances in …, 2025 - proceedings.neurips.cc
A fundamental notion of distance between train and test distributions from the field of domain
adaptation is discrepancy distance. While it is in general hard to compute, here we provide the …
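The discrepancy distance referenced here has a standard definition in domain adaptation, commonly attributed to Mansour, Mohri, and Rostamizadeh; writing it out (for the 0-1 loss, with the paper's exact variant possibly differing) may help. For a hypothesis class $\mathcal{H}$ and distributions $D, D'$,
\[
\operatorname{disc}_{\mathcal{H}}(D, D') = \sup_{h, h' \in \mathcal{H}} \Bigl| \Pr_{x \sim D}[h(x) \neq h'(x)] - \Pr_{x \sim D'}[h(x) \neq h'(x)] \Bigr|,
\]
i.e., the largest change in the disagreement rate between any two hypotheses when the marginal moves from the train distribution $D$ to the test distribution $D'$.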

Learning Neural Networks with Distribution Shift: Efficiently Certifiable Guarantees

G Chandrasekaran, AR Klivans, LL Lee… - arXiv preprint arXiv …, 2025 - arxiv.org
We give the first provably efficient algorithms for learning neural networks with distribution
shift. We work in the Testable Learning with Distribution Shift framework (TDS learning) of …
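A rough sketch of the guarantee in TDS (Testable Learning with Distribution Shift) learning, as it is usually stated in the realizable case; the exact formulation in the cited works may differ. Given labeled samples from the training distribution and unlabeled samples from the test marginal, the algorithm either rejects or outputs a hypothesis $h$ satisfying:
\[
\text{(soundness)}\ \ \text{if it accepts, then } \Pr_{x \sim D'_{\mathcal{X}}}[h(x) \neq f(x)] \leq \epsilon; \qquad
\text{(completeness)}\ \ \text{if } D'_{\mathcal{X}} = D_{\mathcal{X}}, \text{ it accepts with high probability,}
\]
where $D_{\mathcal{X}}$ and $D'_{\mathcal{X}}$ denote the train and test marginals and $f$ the target function.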

A duality framework for analyzing random feature and two-layer neural networks

H Chen, J Long, L Wu - arXiv preprint arXiv:2305.05642, 2023 - arxiv.org
We consider the problem of learning functions within the $\mathcal{F}_{p,\pi}$ and Barron
spaces, which play crucial roles in understanding random feature models (RFMs), two-layer …
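As a reminder of the objects involved (a standard formulation; the paper's notation may differ), functions in $\mathcal{F}_{p,\pi}$ are typically written as infinite-width two-layer networks with feature distribution $\pi$, and the random feature model is their Monte Carlo discretization:
\[
f_a(x) = \int a(v)\, \sigma(\langle v, x\rangle)\, \mathrm{d}\pi(v), \qquad \|f\|_{\mathcal{F}_{p,\pi}} = \inf_{a:\ f_a = f} \|a\|_{L^p(\pi)},
\]
\[
f_{\mathrm{RFM}}(x) = \frac{1}{m} \sum_{j=1}^{m} a_j\, \sigma(\langle v_j, x\rangle), \qquad v_j \sim \pi \ \text{i.i.d.},
\]
with the Barron space obtained, roughly, by also optimizing over the feature distribution $\pi$; the precise relationship is part of what the cited duality framework develops.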