An overview of low-rank matrix recovery from incomplete observations
Low-rank matrices play a fundamental role in modeling and computational methods for
signal processing and machine learning. In many applications where low-rank matrices …
On simplicity and complexity in the brave new world of large-scale neuroscience
P Gao, S Ganguli - Current opinion in neurobiology, 2015 - Elsevier
Technological advances have dramatically expanded our ability to probe multi-neuronal
dynamics and connectivity in the brain. However, our ability to extract a simple conceptual …
Randomized numerical linear algebra: Foundations and algorithms
This survey describes probabilistic algorithms for linear algebraic computations, such as
factorizing matrices and solving linear systems. It focuses on techniques that have a proven …
[BOOK][B] High-dimensional probability: An introduction with applications in data science
R Vershynin - 2018 - books.google.com
High-dimensional probability offers insight into the behavior of random vectors, random
matrices, random subspaces, and objects used to quantify uncertainty in high dimensions …
An introduction to matrix concentration inequalities
JA Tropp - Foundations and Trends® in Machine Learning, 2015 - nowpublishers.com
Random matrices now play a role in many areas of theoretical, applied, and computational
mathematics. Therefore, it is desirable to have tools for studying random matrices that are …
A modern maximum-likelihood theory for high-dimensional logistic regression
Students in statistics or data science usually learn early on that when the sample size n is
large relative to the number of variables p, fitting a logistic model by the method of maximum …
Universality laws for high-dimensional learning with random features
We prove a universality theorem for learning with random features. Our result shows that, in
terms of training and generalization errors, a random feature model with a nonlinear …
Solving inverse problems with deep neural networks–robustness included?
In the past five years, deep learning methods have become state-of-the-art in solving various
inverse problems. Before such approaches can find application in safety-critical fields, a …
FRaC: FMCW-based joint radar-communications system via index modulation
Dual function radar communications (DFRC) systems are attractive technologies for
autonomous vehicles, which utilize electromagnetic waves to constantly sense the …
A model of double descent for high-dimensional binary linear classification
We consider a model for logistic regression where only a subset of features of size is used
for training a linear classifier over training samples. The classifier is obtained by running …