Randomized numerical linear algebra: A perspective on the field with an eye to software

R Murray, J Demmel, MW Mahoney… - arXiv preprint arXiv …, 2023 - arxiv.org
Randomized numerical linear algebra (RandNLA, for short) concerns the use of
randomization as a resource to develop improved algorithms for large-scale linear algebra …
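
As a concrete illustration of "randomization as a resource" (not taken from the survey itself), the following minimal sketch-and-solve least-squares example compresses a tall problem with a Gaussian sketch before solving; the sizes and sketch distribution are illustrative choices only.

```python
# Illustrative sketch-and-solve least squares, a canonical RandNLA primitive.
import numpy as np

rng = np.random.default_rng(0)
m, n, s = 20_000, 50, 500            # tall data matrix, sketch size s >> n

A = rng.standard_normal((m, n))
b = A @ rng.standard_normal(n) + 0.01 * rng.standard_normal(m)

# Gaussian sketching operator S (s x m); S @ A and S @ b are much smaller than A, b.
S = rng.standard_normal((s, m)) / np.sqrt(s)
x_sketch = np.linalg.lstsq(S @ A, S @ b, rcond=None)[0]

x_exact = np.linalg.lstsq(A, b, rcond=None)[0]
print("relative error:", np.linalg.norm(x_sketch - x_exact) / np.linalg.norm(x_exact))
```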

Low-rank approximation and regression in input sparsity time

KL Clarkson, DP Woodruff - Journal of the ACM (JACM), 2017 - dl.acm.org
We design a new distribution over m × n matrices S so that, for any fixed n × d matrix A of rank
r, with probability at least 9/10, ∥SAx∥₂ = (1 ± ε)∥Ax∥₂ simultaneously for all x ∈ R^d …
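
A hedged sketch of the kind of sparse embedding the paper studies: each column of S carries a single random ±1, so S @ A can be formed in time proportional to nnz(A). The dimensions and the single-vector distortion test below are illustrative, not the paper's full subspace-embedding experiment.

```python
# Sparse "CountSketch"-style embedding: one random +/-1 per column of S.
import numpy as np

rng = np.random.default_rng(1)
n, d, m = 20_000, 20, 2_000          # A is n x d, sketch S is m x n

A = rng.standard_normal((n, d))
x = rng.standard_normal(d)

rows  = rng.integers(0, m, size=n)   # hash each of the n rows of A to one of m buckets
signs = rng.choice([-1.0, 1.0], size=n)

SA = np.zeros((m, d))
np.add.at(SA, rows, signs[:, None] * A)   # accumulate signed rows into buckets: cost ~ nnz(A)

ratio = np.linalg.norm(SA @ x) / np.linalg.norm(A @ x)
print("norm distortion ||SAx|| / ||Ax|| =", ratio)   # should be close to 1
```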

Revisiting the Nyström method for improved large-scale machine learning

A Gittens, MW Mahoney - The Journal of Machine Learning Research, 2016 - dl.acm.org
We reconsider randomized algorithms for the low-rank approximation of symmetric positive
semi-definite (SPSD) matrices such as Laplacian and kernel matrices that arise in data …
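
A minimal Nyström approximation of an SPSD kernel matrix, using uniform column sampling (the simplest of the sampling schemes the paper compares); the kernel, bandwidth, and sample size are illustrative.

```python
# Illustrative Nystrom approximation: K ~= C @ pinv(W) @ C.T from l sampled columns.
import numpy as np

rng = np.random.default_rng(2)
n, l = 2_000, 200

X = rng.standard_normal((n, 5))                       # data points in R^5
sq = np.sum(X**2, axis=1)
D2 = sq[:, None] + sq[None, :] - 2 * X @ X.T          # squared pairwise distances
K = np.exp(-D2 / 10.0)                                # SPSD RBF kernel matrix

idx = rng.choice(n, size=l, replace=False)            # uniform column sampling
C = K[:, idx]                                         # n x l: sampled columns
W = C[idx, :]                                         # l x l: intersection block
K_nys = C @ np.linalg.pinv(W) @ C.T                   # Nystrom approximation

print("relative Frobenius error:", np.linalg.norm(K - K_nys) / np.linalg.norm(K))
```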

Practical sketching algorithms for low-rank matrix approximation

JA Tropp, A Yurtsever, M Udell, V Cevher - SIAM Journal on Matrix Analysis …, 2017 - SIAM
This paper describes a suite of algorithms for constructing low-rank approximations of an
input matrix from a random linear image, or sketch, of the matrix. These methods can …
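
A small sketch in the spirit of the paper's two-sided approach: the matrix is touched only through the random sketches Y = AΩ and W = ΨA, from which a low-rank approximation is assembled. The sketch sizes below are illustrative rather than the paper's tuned recommendations.

```python
# Illustrative two-sided sketching low-rank approximation: store Y = A @ Om and W = Psi @ A,
# then recover A ~= Q @ X without revisiting A.
import numpy as np

rng = np.random.default_rng(3)
m, n, r = 1_000, 800, 20
k, l = 35, 70                         # sketch sizes, with l > k

# Synthetic matrix with a rapidly decaying spectrum.
U = np.linalg.qr(rng.standard_normal((m, r)))[0]
V = np.linalg.qr(rng.standard_normal((n, r)))[0]
A = U @ np.diag(2.0 ** -np.arange(r)) @ V.T

Om  = rng.standard_normal((n, k))     # right test matrix
Psi = rng.standard_normal((l, m))     # left test matrix

Y = A @ Om                            # range sketch
W = Psi @ A                           # co-range sketch

Q, _ = np.linalg.qr(Y)                # orthonormal basis for the captured range
X = np.linalg.lstsq(Psi @ Q, W, rcond=None)[0]
A_hat = Q @ X

print("relative error:", np.linalg.norm(A - A_hat) / np.linalg.norm(A))
```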

Paved with good intentions: analysis of a randomized block Kaczmarz method

D Needell, JA Tropp - Linear Algebra and its Applications, 2014 - Elsevier
The block Kaczmarz method is an iterative scheme for solving overdetermined least-squares
problems. At each step, the algorithm projects the current iterate onto the solution space of a …
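
A hedged sketch of a randomized block Kaczmarz iteration on a consistent overdetermined system; the fixed random row partition, block size, and iteration count are illustrative choices.

```python
# Illustrative randomized block Kaczmarz: at each step, project the iterate onto the
# solution space of one randomly chosen row block.
import numpy as np

rng = np.random.default_rng(4)
m, n, block = 2_000, 100, 50

A = rng.standard_normal((m, n))
x_true = rng.standard_normal(n)
b = A @ x_true                                             # consistent system

blocks = np.array_split(rng.permutation(m), m // block)    # fixed random row paving
x = np.zeros(n)
for it in range(200):
    tau = blocks[rng.integers(len(blocks))]                # pick a block uniformly at random
    x += np.linalg.pinv(A[tau]) @ (b[tau] - A[tau] @ x)    # project onto {x : A_tau x = b_tau}

print("relative error:", np.linalg.norm(x - x_true) / np.linalg.norm(x_true))
```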

Hilbert space methods for reduced-rank Gaussian process regression

A Solin, S Särkkä - Statistics and Computing, 2020 - Springer
This paper proposes a novel scheme for reduced-rank Gaussian process regression. The
method is based on an approximate series expansion of the covariance function in terms of …
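
A minimal 1-D instance of the idea, assuming a squared-exponential kernel on an interval [-L, L]: the covariance is approximated by Laplacian eigenfunctions weighted by the kernel's spectral density. The domain size and number of basis functions are illustrative choices.

```python
# Illustrative reduced-rank covariance: k(x, x') ~= sum_j S(sqrt(lam_j)) phi_j(x) phi_j(x'),
# with Laplacian eigenpairs (phi_j, lam_j) on [-L, L] and spectral density S of the kernel.
import numpy as np

L, J = 5.0, 64                       # domain half-width, number of basis functions
ell, sigma2 = 0.8, 1.0               # squared-exponential lengthscale and variance

def phi(x, j):                       # Dirichlet Laplacian eigenfunctions on [-L, L]
    return np.sqrt(1.0 / L) * np.sin(np.pi * j * (x + L) / (2 * L))

def spectral_density(w):             # spectral density of the 1-D squared-exponential kernel
    return sigma2 * np.sqrt(2 * np.pi) * ell * np.exp(-0.5 * (w * ell) ** 2)

x = np.linspace(-3, 3, 200)
j = np.arange(1, J + 1)
lam_sqrt = np.pi * j / (2 * L)                      # square roots of Laplacian eigenvalues
Phi = phi(x[:, None], j[None, :])                   # (200, J) basis matrix
K_approx = Phi @ np.diag(spectral_density(lam_sqrt)) @ Phi.T

K_exact = sigma2 * np.exp(-0.5 * (x[:, None] - x[None, :]) ** 2 / ell ** 2)
print("max abs error:", np.abs(K_approx - K_exact).max())
```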

Randomized sketches for kernels: Fast and optimal nonparametric regression

Y Yang, M Pilanci, MJ Wainwright - The Annals of Statistics, 2017 - projecteuclid.org
Kernel ridge regression (KRR) is a standard method for performing nonparametric
regression over reproducing kernel Hilbert spaces. Given n samples, the time and space …
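
A hedged sketch of sketched KRR: the n kernel coefficients are restricted to the row space of a small random sketch, so only an m × m system is solved. The Gaussian sketch, kernel, sketch size, and regularization below are illustrative choices, not the paper's prescriptions.

```python
# Illustrative sketched kernel ridge regression: restrict alpha = S.T @ theta for an m x n
# random sketch S, and solve the resulting m x m system.
import numpy as np

rng = np.random.default_rng(5)
n, m, lam = 2_000, 100, 1e-3

x = np.sort(rng.uniform(-3, 3, n))
y = np.sin(2 * x) + 0.1 * rng.standard_normal(n)

K = np.exp(-0.5 * (x[:, None] - x[None, :]) ** 2 / 0.5 ** 2)   # RBF kernel matrix

S = rng.standard_normal((m, n)) / np.sqrt(m)
KS = K @ S.T                                                    # n x m
# Normal equations of min_theta (1/n)||y - K S^T theta||^2 + lam * theta^T (S K S^T) theta
lhs = KS.T @ KS + n * lam * (S @ KS)
theta = np.linalg.lstsq(lhs, KS.T @ y, rcond=None)[0]           # lstsq guards against rank deficiency
f_sketch = KS @ theta                                           # fitted values

alpha_full = np.linalg.solve(K + n * lam * np.eye(n), y)        # exact KRR for comparison
f_full = K @ alpha_full
print("fit difference:", np.linalg.norm(f_sketch - f_full) / np.linalg.norm(f_full))
```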

Optimal CUR matrix decompositions

C Boutsidis, DP Woodruff - Proceedings of the forty-sixth annual ACM …, 2014 - dl.acm.org
The CUR decomposition of an m × n matrix A finds an m × c matrix C with a small subset of c <
n columns of A, together with an r × n matrix R with a small subset of r < m rows of A, as well …
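
A basic CUR construction for intuition, using uniform column/row sampling rather than the paper's optimal selection: given C and R, the middle factor U = C⁺AR⁺ is the best possible choice in the Frobenius norm.

```python
# Illustrative CUR approximation with uniformly sampled columns and rows.
import numpy as np

rng = np.random.default_rng(6)
m, n, rank, c, r = 500, 400, 10, 30, 30

A = rng.standard_normal((m, rank)) @ rng.standard_normal((rank, n))   # low-rank test matrix

col_idx = rng.choice(n, size=c, replace=False)
row_idx = rng.choice(m, size=r, replace=False)
C = A[:, col_idx]                      # m x c: actual columns of A
R = A[row_idx, :]                      # r x n: actual rows of A
U = np.linalg.pinv(C) @ A @ np.linalg.pinv(R)   # best middle factor for this C and R

print("relative error:", np.linalg.norm(A - C @ U @ R) / np.linalg.norm(A))
```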

Simpler is better: a comparative study of randomized pivoting algorithms for CUR and interpolative decompositions

Y Dong, PG Martinsson - Advances in Computational Mathematics, 2023 - Springer
Matrix skeletonizations like the interpolative and CUR decompositions provide a framework
for low-rank approximation in which subsets of a given matrix's columns and/or rows are …
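
A hedged illustration of randomized pivoting for a column interpolative decomposition: pivot columns are taken from a column-pivoted QR of a small random sketch of A. The SciPy call is a real routine; the oversampling and problem sizes are illustrative details, not the paper's exact procedure.

```python
# Illustrative column ID via "sketch, then column-pivoted QR" pivot selection.
import numpy as np
from scipy.linalg import qr

rng = np.random.default_rng(7)
m, n, rank, k = 600, 500, 15, 20

A = rng.standard_normal((m, rank)) @ rng.standard_normal((rank, n))   # low-rank test matrix

Y = rng.standard_normal((k + 5, m)) @ A            # small row sketch of A
_, _, piv = qr(Y, pivoting=True, mode='economic')  # column-pivoted QR on the sketch
cols = piv[:k]                                     # indices of k selected columns

C = A[:, cols]                                     # skeleton columns
T = np.linalg.lstsq(C, A, rcond=None)[0]           # interpolation coefficients: A ~= C @ T
print("relative error:", np.linalg.norm(A - C @ T) / np.linalg.norm(A))
```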

Low-rank Tucker approximation of a tensor from streaming data

Y Sun, Y Guo, C Luo, J Tropp, M Udell - SIAM Journal on Mathematics of Data …, 2020 - SIAM
This paper describes a new algorithm for computing a low-Tucker-rank approximation of a
tensor. The method applies a randomized linear map to the tensor to obtain a sketch that …
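
A simplified, two-pass randomized Tucker (HOSVD-style) sketch for intuition: each mode unfolding is compressed with a random test matrix to obtain factor bases, and the core is formed by contracting them in. The paper's algorithm is one-pass and streaming, which this example does not attempt to reproduce.

```python
# Illustrative randomized Tucker approximation via per-mode randomized range finders.
import numpy as np

rng = np.random.default_rng(8)
dims, ranks = (40, 50, 60), (5, 6, 7)

# Synthetic tensor with exact Tucker structure.
core = rng.standard_normal(ranks)
factors = [rng.standard_normal((d, r)) for d, r in zip(dims, ranks)]
A = np.einsum('abc,ia,jb,kc->ijk', core, *factors)

def unfold(T, mode):
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

# Randomized range finder per mode: Q_k approximately spans the mode-k fiber space.
Qs = []
for mode, r in enumerate(ranks):
    Ak = unfold(A, mode)
    Y = Ak @ rng.standard_normal((Ak.shape[1], r + 5))
    Qs.append(np.linalg.qr(Y)[0])

# Core: contract A with Q_k^T along every mode, then reconstruct.
G = np.einsum('ijk,ia,jb,kc->abc', A, *Qs)
A_hat = np.einsum('abc,ia,jb,kc->ijk', G, *Qs)
print("relative error:", np.linalg.norm(A - A_hat) / np.linalg.norm(A))
```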