An overview of low-rank matrix recovery from incomplete observations
Low-rank matrices play a fundamental role in modeling and computational methods for
signal processing and machine learning. In many applications where low-rank matrices …
Complete dictionary recovery over the sphere I: Overview and the geometric picture
We consider the problem of recovering a complete (i.e., square and invertible) matrix A₀
from Y ∈ ℝ^{n×p} with Y = A₀X₀, provided X₀ is sufficiently sparse. This recovery problem is …
Randomized numerical linear algebra: Foundations and algorithms
This survey describes probabilistic algorithms for linear algebraic computations, such as
factorizing matrices and solving linear systems. It focuses on techniques that have a proven …
Direction-of-arrival estimation for coprime array via virtual array interpolation
Coprime arrays can achieve an increased number of degrees of freedom by deriving the
equivalent signals of a virtual array. However, most existing methods fail to utilize all …
[BOOK][B] High-dimensional probability: An introduction with applications in data science
R Vershynin - 2018 - books.google.com
High-dimensional probability offers insight into the behavior of random vectors, random
matrices, random subspaces, and objects used to quantify uncertainty in high dimensions …
Scalable methods for 8-bit training of neural networks
Quantized Neural Networks (QNNs) are often used to improve network efficiency
during the inference phase, i.e., after the network has been trained. Extensive research in the …
Atomo: Communication-efficient learning via atomic sparsification
Distributed model training suffers from communication overheads due to frequent gradient
updates transmitted between compute nodes. To mitigate these overheads, several studies …
[BOOK][B] Introduction to uncertainty quantification
TJ Sullivan - 2015 - books.google.com
This text provides a framework in which the main objectives of the field of uncertainty
quantification (UQ) are defined and an overview of the range of mathematical methods by …
An introduction to matrix concentration inequalities
JA Tropp - Foundations and Trends® in Machine Learning, 2015 - nowpublishers.com
Random matrices now play a role in many areas of theoretical, applied, and computational
mathematics. Therefore, it is desirable to have tools for studying random matrices that are …
Learning with differentiable perturbed optimizers
Machine learning pipelines often rely on optimization procedures to make discrete
decisions (e.g., sorting, picking closest neighbors, or shortest paths). Although these discrete …