RandNLA: randomized numerical linear algebra
P Drineas, MW Mahoney - Communications of the ACM, 2016 (Vol. 59, No. 6, DOI:10.1145/2842602)
Randomization offers new …
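The core move in RandNLA is to compress a large matrix problem with a random sketch and solve the small problem instead. Below is a minimal sketch-and-solve least-squares example in Python with a Gaussian sketching matrix; the function name and the choice of 500 sketch rows are illustrative assumptions, not anything prescribed by the article.

```python
import numpy as np

def sketched_lstsq(A, b, sketch_rows, seed=None):
    """Sketch-and-solve least squares: compress min ||Ax - b||_2
    from n equations down to sketch_rows equations, then solve
    the small problem exactly. Illustrative sketch only."""
    rng = np.random.default_rng(seed)
    n = A.shape[0]
    # Gaussian sketch; the 1/sqrt(sketch_rows) scaling makes E[S^T S] = I.
    S = rng.standard_normal((sketch_rows, n)) / np.sqrt(sketch_rows)
    x, *_ = np.linalg.lstsq(S @ A, S @ b, rcond=None)
    return x

# Usage: a tall 10000 x 50 problem solved from a 500-row sketch.
rng = np.random.default_rng(0)
A = rng.standard_normal((10000, 50))
b = rng.standard_normal(10000)
x_sketch = sketched_lstsq(A, b, sketch_rows=500, seed=1)
x_exact, *_ = np.linalg.lstsq(A, b, rcond=None)
print(np.linalg.norm(x_sketch - x_exact) / np.linalg.norm(x_exact))
```

In practice RandNLA implementations replace the dense Gaussian sketch with fast transforms or sparse embeddings so the sketching step costs less than the solve it replaces.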
An introduction to matrix concentration inequalities
JA Tropp - Foundations and Trends® in Machine Learning, 2015 - nowpublishers.com
Random matrices now play a role in many areas of theoretical, applied, and computational
mathematics. Therefore, it is desirable to have tools for studying random matrices that are …
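A representative result from the monograph is the matrix Bernstein inequality, stated below from memory (so treat it as a hedged paraphrase rather than a quotation): for independent, zero-mean, self-adjoint $d \times d$ random matrices $X_k$ with $\|X_k\| \le L$ almost surely,

```latex
\[
  \mathbb{P}\Big\{ \Big\| \sum_{k} X_k \Big\| \ge t \Big\}
  \;\le\; 2d \exp\!\left( \frac{-t^2/2}{\sigma^2 + Lt/3} \right),
  \qquad
  \sigma^2 = \Big\| \sum_{k} \mathbb{E}\,X_k^2 \Big\| .
\]
```

The dimensional factor $2d$ is the price of moving from scalar to matrix concentration; much of the monograph concerns when and how it can be sharpened.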
Data-dependent coresets for compressing neural networks with applications to generalization bounds
We present an efficient coresets-based neural network compression algorithm that sparsifies
the parameters of a trained fully-connected neural network in a manner that provably …
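The coreset constructions in this line of work boil down to importance sampling of network weights with data-dependent sensitivities. The Python fragment below is a deliberately simplified stand-in that samples entries of one weight matrix proportionally to magnitude and reweights the survivors to keep the layer unbiased; the paper's actual sensitivity scores are more refined, and every name here is hypothetical.

```python
import numpy as np

def sampled_sparsify(W, num_samples, seed=None):
    """Keep a random subset of weight entries, chosen with probability
    proportional to |W_ij| and reweighted by 1/(num_samples * p_ij),
    so the sparse matrix is an unbiased estimator of W."""
    rng = np.random.default_rng(seed)
    p = np.abs(W).ravel()
    p /= p.sum()                                  # sampling distribution
    idx = rng.choice(W.size, size=num_samples, replace=True, p=p)
    W_hat = np.zeros(W.size)
    np.add.at(W_hat, idx, W.ravel()[idx] / (num_samples * p[idx]))
    return W_hat.reshape(W.shape)

W = np.random.default_rng(0).standard_normal((256, 256))
W_hat = sampled_sparsify(W, num_samples=5000, seed=1)
print("fraction of entries kept:", (W_hat != 0).mean())
```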
Data-independent neural pruning via coresets
Previous work showed empirically that large neural networks can be significantly reduced in
size while preserving their accuracy. Model compression became a central research topic …
An improved classical singular value transformation for quantum machine learning
The field of quantum machine learning (QML) produces many proposals for attaining
quantum speedups for tasks in machine learning and data analysis. Such speedups can …
User-friendly tools for random matrices: An introduction
JA Tropp - NIPS Tutorial, 2012 - Citeseer
Nota Bene: This manuscript has not yet reached its final form. In particular, I have not had
the opportunity to check all the details carefully and to polish the writing so that it reflects the …
Data-independent structured pruning of neural networks via coresets
Model compression is crucial for the deployment of neural networks on devices with limited
computational and memory resources. Many different methods show comparable accuracy …
Detection thresholds in very sparse matrix completion
We study the matrix completion problem: an underlying $m \times n$ matrix $P$ is low rank, with
incoherent singular vectors, and a random $m \times n$ matrix $A$ is equal to $P$ on a (uniformly) …
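Spelling out the observation model behind the truncated sentence (my paraphrase; the precise sampling regime is an assumption here): each entry of $P$ is revealed independently with some probability $p$, and unrevealed entries are zero,

```latex
\[
  A_{ij} =
  \begin{cases}
    P_{ij} & \text{with probability } p, \\
    0      & \text{with probability } 1 - p,
  \end{cases}
  \qquad \operatorname{rank}(P) \ll \min(m, n).
\]
```

The "very sparse" regime of the title refers, as I read it, to $p$ so small that each row holds only $O(1)$ observed entries on average, which is where detection (deciding whether any signal is recoverable at all) becomes the right question.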
Near-optimal entrywise sampling of numerically sparse matrices
Many real-world data sets are sparse or almost sparse. One method to measure this for a
matrix $A \in \mathbb{R}^{n \times n}$ is the \emph{numerical sparsity}, denoted $\mathsf …
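Since the snippet truncates right at the definition, here is the standard form of numerical sparsity used in this literature, reconstructed rather than quoted: for a nonzero vector $x$,

```latex
\[
  \mathsf{ns}(x) \;=\; \frac{\|x\|_1^2}{\|x\|_2^2},
  \qquad 1 \;\le\; \mathsf{ns}(x) \;\le\; \|x\|_0 ,
\]
```

so $\mathsf{ns}(x)$ is a smooth proxy for the support size (the upper bound is Cauchy–Schwarz; the lower bound is $\|x\|_1 \ge \|x\|_2$). Papers extend it to a matrix $A$ either row-wise or through the entrywise ratio $\|A\|_1^2 / \|A\|_F^2$; which convention this particular paper uses is cut off here.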
Exploiting numerical sparsity for efficient learning: faster eigenvector computation and regression
In this paper, we obtain improved running times for regression and top eigenvector
computation for numerically sparse matrices. Given a data matrix $A \in \mathbb{R}^{n \times d}$ …
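The primitive that numerical sparsity unlocks is cheap unbiased estimation of inner products by $\ell_1$ sampling: draw coordinates of a row with probability proportional to their magnitudes, and the estimator's variance is governed by $\mathsf{ns}$ rather than by the dimension. A minimal Python illustration, with the sample count chosen arbitrarily:

```python
import numpy as np

def sampled_dot(a, x, num_samples, seed=None):
    """Unbiased estimate of <a, x> from num_samples l1-samples of a:
    draw index j with probability |a_j| / ||a||_1 and average
    ||a||_1 * sign(a_j) * x_j over the draws."""
    rng = np.random.default_rng(seed)
    l1 = np.abs(a).sum()
    p = np.abs(a) / l1
    j = rng.choice(a.size, size=num_samples, p=p)
    return float(np.mean(l1 * np.sign(a[j]) * x[j]))

rng = np.random.default_rng(0)
a = rng.standard_normal(100_000)
x = rng.standard_normal(100_000)
print(sampled_dot(a, x, num_samples=2000, seed=1), "vs exact", float(a @ x))
```

Unbiasedness is immediate: $\mathbb{E}[\|a\|_1 \operatorname{sign}(a_j)\, x_j] = \sum_j a_j x_j$ under the $\ell_1$ sampling distribution.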