Deep neural networks tend to extrapolate predictably
Conventional wisdom suggests that neural network predictions tend to be unpredictable and
overconfident when faced with out-of-distribution (OOD) inputs. Our work reassesses this …
Learning Low Dimensional State Spaces with Overparameterized Recurrent Neural Nets
Overparameterization in deep learning typically refers to settings where a trained neural
network (NN) has representational capacity to fit the training data in many ways, some of …
The Implicit Bias of Structured State Space Models Can Be Poisoned With Clean Labels
Neural networks are powered by an implicit bias: a tendency of gradient descent to fit
training data in a way that generalizes to unseen data. A recent class of neural network …