An overview of low-rank matrix recovery from incomplete observations

MA Davenport, J Romberg - IEEE Journal of Selected Topics in …, 2016 - ieeexplore.ieee.org
Low-rank matrices play a fundamental role in modeling and computational methods for
signal processing and machine learning. In many applications where low-rank matrices …

On simplicity and complexity in the brave new world of large-scale neuroscience

P Gao, S Ganguli - Current Opinion in Neurobiology, 2015 - Elsevier
Technological advances have dramatically expanded our ability to probe multi-neuronal
dynamics and connectivity in the brain. However, our ability to extract a simple conceptual …

Randomized numerical linear algebra: Foundations and algorithms

PG Martinsson, JA Tropp - Acta Numerica, 2020 - cambridge.org
This survey describes probabilistic algorithms for linear algebraic computations, such as
factorizing matrices and solving linear systems. It focuses on techniques that have a proven …

[BOOK][B] High-dimensional probability: An introduction with applications in data science

R Vershynin - 2018 - books.google.com
High-dimensional probability offers insight into the behavior of random vectors, random
matrices, random subspaces, and objects used to quantify uncertainty in high dimensions …

An introduction to matrix concentration inequalities

JA Tropp - Foundations and Trends® in Machine Learning, 2015 - nowpublishers.com
Random matrices now play a role in many areas of theoretical, applied, and computational
mathematics. Therefore, it is desirable to have tools for studying random matrices that are …

A modern maximum-likelihood theory for high-dimensional logistic regression

P Sur, EJ Candès - Proceedings of the National Academy of …, 2019 - National Acad Sciences
Students in statistics or data science usually learn early on that when the sample size n is
large relative to the number of variables p, fitting a logistic model by the method of maximum …

Universality laws for high-dimensional learning with random features

H Hu, YM Lu - IEEE Transactions on Information Theory, 2022 - ieeexplore.ieee.org
We prove a universality theorem for learning with random features. Our result shows that, in
terms of training and generalization errors, a random feature model with a nonlinear …

Solving inverse problems with deep neural networks–robustness included?

M Genzel, J Macdonald, M März - IEEE Transactions on Pattern …, 2022 - ieeexplore.ieee.org
In the past five years, deep learning methods have become state-of-the-art in solving various
inverse problems. Before such approaches can find application in safety-critical fields, a …

FRaC: FMCW-based joint radar-communications system via index modulation

D Ma, N Shlezinger, T Huang, Y Liu… - IEEE Journal of Selected …, 2021 - ieeexplore.ieee.org
Dual function radar communications (DFRC) systems are attractive technologies for
autonomous vehicles, which utilize electromagnetic waves to constantly sense the …

A model of double descent for high-dimensional binary linear classification

Z Deng, A Kammoun… - Information and Inference …, 2022 - academic.oup.com
We consider a model for logistic regression where only a subset of the features is used
for training a linear classifier over the training samples. The classifier is obtained by running …