A high-bias, low-variance introduction to machine learning for physicists
Abstract: Machine Learning (ML) is one of the most exciting and dynamic areas of modern
research and application. The purpose of this review is to provide an introduction to the core …
Representations and generalization in artificial and brain neural networks
Humans and animals excel at generalizing from limited data, a capability yet to be fully
replicated in artificial intelligence. This perspective investigates generalization in biological …
Beyond neural scaling laws: beating power law scaling via data pruning
Widely observed neural scaling laws, in which error falls off as a power of the training set
size, model size, or both, have driven substantial performance improvements in deep …
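The power-law form quoted above can be made concrete with a short sketch. The snippet below fits the exponent alpha of a hypothetical error-versus-training-set-size curve err(n) = a * n**(-alpha); the data points and sizes are illustrative assumptions, not results from the paper.

import numpy as np

# Hypothetical test errors measured at increasing training-set sizes n.
n = np.array([1e3, 1e4, 1e5, 1e6])
err = np.array([0.30, 0.19, 0.12, 0.075])

# A power law err = a * n**(-alpha) is linear in log-log space,
# so alpha is the negated slope of a least-squares line fit.
slope, intercept = np.polyfit(np.log(n), np.log(err), 1)
print(f"fitted exponent alpha = {-slope:.2f}")

In the paper's framing, careful data pruning aims to make error fall faster than any such fitted power law in n.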
High-dimensional dynamics of generalization error in neural networks
We perform an analysis of the average generalization dynamics of large neural networks
trained using gradient descent. We study the practically-relevant “high-dimensional” regime …
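As a companion to this entry, here is a minimal numerical sketch of the setting it describes: full-batch gradient descent on a noisy linear regression problem with the sample count comparable to the input dimension, printing test error as training proceeds. The sizes, noise level, and learning rate are illustrative assumptions, not the paper's analysis.

import numpy as np

rng = np.random.default_rng(0)
d = 100                        # input dimension
n_train, n_test = 100, 1000    # n_train ~ d: the "high-dimensional" regime
w_true = rng.normal(size=d) / np.sqrt(d)
X = rng.normal(size=(n_train, d))
y = X @ w_true + 0.1 * rng.normal(size=n_train)   # noisy linear teacher
X_test = rng.normal(size=(n_test, d))
y_test = X_test @ w_true

w = np.zeros(d)
lr = 0.05
for step in range(501):
    w -= lr * X.T @ (X @ w - y) / n_train           # full-batch gradient step
    if step % 100 == 0:
        mse = np.mean((X_test @ w - y_test) ** 2)   # generalization error
        print(step, round(float(mse), 4))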
Statistical mechanics of deep learning
The recent striking success of deep neural networks in machine learning raises profound
questions about the theoretical principles underlying their success. For example, what can …
Spectrum dependent learning curves in kernel regression and wide neural networks
B Bordelon, A Canatar… - … Conference on Machine …, 2020 - proceedings.mlr.press
We derive analytical expressions for the generalization performance of kernel regression as
a function of the number of training samples using theoretical methods from Gaussian …
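The analytical expressions are derived in the paper itself; the sketch below only traces an empirical learning curve for kernel regression (with a tiny ridge for numerical stability), so the kernel, target function, and sample sizes are assumptions made for illustration.

import numpy as np

rng = np.random.default_rng(1)

def rbf(A, B, ell=0.5):
    # Gaussian (RBF) kernel matrix between the rows of A and B.
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * ell ** 2))

def target(x):
    return np.sin(3 * x).sum(-1)   # hypothetical smooth target function

X_test = rng.uniform(-1, 1, (500, 1))
y_test = target(X_test)

for n in [10, 40, 160, 640]:
    X = rng.uniform(-1, 1, (n, 1))
    y = target(X)
    coef = np.linalg.solve(rbf(X, X) + 1e-8 * np.eye(n), y)
    pred = rbf(X_test, X) @ coef
    print(n, np.mean((pred - y_test) ** 2))   # test MSE falls as n grows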
Generalisation error in learning with random features and the hidden manifold model
We study generalised linear regression and classification for a synthetically generated
dataset encompassing different problems of interest, such as learning with random features …
Double trouble in double descent: Bias and variance(s) in the lazy regime
Deep neural networks can achieve remarkable generalization performances while
interpolating the training data. Rather than the U-curve emblematic of the bias-variance …
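The departure from the textbook U-curve can be reproduced in miniature with random-features regression in the lazy regime, where the random first layer stays fixed and only the readout is fit; test error peaks when the feature count p matches the sample count n and falls again beyond it. All sizes, the noise level, and the ReLU feature map are illustrative assumptions.

import numpy as np

rng = np.random.default_rng(2)
d, n, n_test = 20, 100, 2000
w = rng.normal(size=d) / np.sqrt(d)
X = rng.normal(size=(n, d))
y = X @ w + 0.2 * rng.normal(size=n)
X_test = rng.normal(size=(n_test, d))
y_test = X_test @ w

for p in [25, 50, 90, 100, 110, 200, 800]:       # number of random features
    V = rng.normal(size=(d, p)) / np.sqrt(d)     # fixed random first layer
    F = np.maximum(X @ V, 0)                     # ReLU random features (train)
    F_test = np.maximum(X_test @ V, 0)           # ReLU random features (test)
    a = np.linalg.pinv(F) @ y                    # minimum-norm least squares
    print(p, np.mean((F_test @ a - y_test) ** 2))  # error peaks near p = n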
On simplicity and complexity in the brave new world of large-scale neuroscience
P Gao, S Ganguli - Current opinion in neurobiology, 2015 - Elsevier
Technological advances have dramatically expanded our ability to probe multi-neuronal
dynamics and connectivity in the brain. However, our ability to extract a simple conceptual …
Accurate estimation of neural population dynamics without spike sorting
A central goal of systems neuroscience is to relate an organism's neural activity to behavior.
Neural population analyses often reduce the data dimensionality to focus on relevant activity …