RandNLA: randomized numerical linear algebra

P Drineas, MW Mahoney - Communications of the ACM, 2016 - dl.acm.org
Review article, Communications of the ACM, June 2016, Vol. 59, No. 6. DOI:10.1145/2842602. Randomization offers new …

An introduction to matrix concentration inequalities

JA Tropp - Foundations and Trends® in Machine Learning, 2015 - nowpublishers.com
Random matrices now play a role in many areas of theoretical, applied, and computational
mathematics. Therefore, it is desirable to have tools for studying random matrices that are …

Frequent directions: Simple and deterministic matrix sketching

M Ghashami, E Liberty, JM Phillips… - SIAM Journal on …, 2016 - SIAM
We describe a new algorithm called FrequentDirections for deterministic matrix sketching in
the row-update model. The algorithm is presented with an arbitrary input matrix A ∈ R^{n×d} one …
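The row-update model in this snippet streams the rows of A one at a time while maintaining a small sketch. A minimal NumPy sketch of FrequentDirections along those lines (sketch size `ell` and the shrink-by-smallest-squared-singular-value step follow the standard presentation; variable names are illustrative):

```python
import numpy as np

def frequent_directions(A, ell):
    # Maintain an ell x d sketch B over a stream of rows of A.
    # Deterministic guarantee: ||A^T A - B^T B||_2 <= ||A||_F^2 / ell.
    _, d = A.shape
    B = np.zeros((ell, d))
    for row in A:
        if np.all(np.abs(B).sum(axis=1) > 0):
            # B is full: shrink all singular values by the smallest one,
            # which zeroes out (at least) the last row of the sketch.
            _, s, Vt = np.linalg.svd(B, full_matrices=False)
            shrunk = np.sqrt(np.maximum(s**2 - s[-1]**2, 0.0))
            B = shrunk[:, None] * Vt
        # Insert the incoming row into the first empty slot.
        B[np.flatnonzero(np.abs(B).sum(axis=1) == 0)[0]] = row
    return B
```

Each SVD costs O(ℓ²d), and the ℓ-row sketch satisfies the covariance-error bound above deterministically, with no randomness or failure probability.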

Preconditioned data sparsification for big data with applications to PCA and K-means

F Pourkamali-Anaraki, S Becker - IEEE Transactions on …, 2017 - ieeexplore.ieee.org
We analyze a compression scheme for large data sets that randomly keeps a small
percentage of the components of each data sample. The benefit is that the output is a sparse …
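The element-wise compression described here can be illustrated by the standard keep-with-probability-p, rescale-by-1/p scheme (a generic unbiased sketch, not necessarily the paper's exact preconditioned sampler):

```python
import numpy as np

def sparsify(X, p, seed=0):
    # Keep each entry of X independently with probability p and rescale
    # the survivors by 1/p, so E[sparsify(X, p)] = X (unbiased).
    rng = np.random.default_rng(seed)
    mask = rng.random(X.shape) < p
    return np.where(mask, X / p, 0.0)
```

The output has roughly a p fraction of nonzeros, so downstream tasks such as PCA or k-means can exploit sparse arithmetic; the rescaling keeps the estimate unbiased at the cost of extra variance.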

Importance sparsification for sinkhorn algorithm

M Li, J Yu, T Li, C Meng - Journal of Machine Learning Research, 2023 - jmlr.org
The Sinkhorn algorithm has been used pervasively to approximate the solution to optimal
transport (OT) and unbalanced optimal transport (UOT) problems. However, its practical …
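For context, the (dense) Sinkhorn iteration alternately rescales the rows and columns of the Gibbs kernel K = exp(−C/reg) until the transport plan's marginals match; a minimal version, without the paper's importance sparsification, might look like:

```python
import numpy as np

def sinkhorn(C, a, b, reg=0.5, iters=200):
    # Entropy-regularized OT: iterate u, v scalings of K = exp(-C/reg)
    # so that the plan P = diag(u) K diag(v) has marginals a and b.
    K = np.exp(-C / reg)
    u = np.ones_like(a)
    v = np.ones_like(b)
    for _ in range(iters):
        v = b / (K.T @ u)
        u = a / (K @ v)
    return u[:, None] * K * v[None, :]
```

Each iteration is two matrix-vector products, O(nm) for an n×m cost matrix; sparsifying K is precisely what cuts this cost in an importance-sparsification approach.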

Even sparser graph transformers

H Shirzad, H Lin, B Venkatachalam, A Velingker… - arXiv preprint arXiv:…, 2024 - arxiv.org
Graph Transformers excel in long-range dependency modeling, but generally require
quadratic memory complexity in the number of nodes in an input graph, and hence have …

Universal matrix sparsifiers and fast deterministic algorithms for linear algebra

R Bhattacharjee, G Dexter, C Musco, A Ray… - arXiv preprint arXiv:…, 2023 - arxiv.org
Let $\mathbf{S} \in \mathbb{R}^{n \times n}$ satisfy $\|\mathbf{1} - \mathbf{S}\|_2 \le \epsilon n$,
where $\mathbf{1}$ is the all-ones matrix and $\|\cdot\|_2$ is the spectral norm. It is well …
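For intuition about the condition ‖1 − S‖₂ ≤ εn: uniform random sampling of the all-ones matrix (keep each entry with probability p, rescale by 1/p) already gives ‖1 − S‖₂ = O(√(n/p)) with high probability, far below n. The paper's contribution is deterministic constructions, so this randomized check is only an illustration:

```python
import numpy as np

def sampled_ones(n, p, seed=0):
    # Keep each entry of the n x n all-ones matrix with probability p,
    # rescaled by 1/p so that E[S] is exactly the all-ones matrix.
    rng = np.random.default_rng(seed)
    return (rng.random((n, n)) < p).astype(float) / p

n, p = 300, 0.3
S = sampled_ones(n, p)
rel_err = np.linalg.norm(np.ones((n, n)) - S, 2) / n  # spectral error / n
```

The centered matrix 1 − S has i.i.d. mean-zero entries of variance (1−p)/p, so its spectral norm concentrates near 2√(n(1−p)/p) ≈ 0.18n here, even though 70% of the entries have been zeroed.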

Tensor sparsification via a bound on the spectral norm of random tensors

NH Nguyen, P Drineas, TD Tran - Information and Inference: A …, 2015 - academic.oup.com
Given an order-$d$ tensor, we present a simple, element-wise sparsification algorithm that
zeroes out all sufficiently small elements of the tensor, keeps all sufficiently large elements, and …
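The element-wise scheme in the abstract has three regimes: entries large enough to keep exactly, entries small enough to zero out, and an intermediate range handled by random sampling. A NumPy sketch under those assumptions (thresholds and sampling rule are illustrative, not the paper's tuned choices):

```python
import numpy as np

def sparsify_tensor(T, keep_thresh, drop_thresh, p, seed=0):
    # Keep large entries exactly, zero out small ones, and sample the
    # intermediate entries with probability p (rescaled by 1/p, so the
    # sampled range is estimated without bias).
    rng = np.random.default_rng(seed)
    big = np.abs(T) >= keep_thresh
    small = np.abs(T) < drop_thresh
    mid = ~big & ~small
    sampled = mid & (rng.random(T.shape) < p)
    out = np.zeros_like(T)
    out[big] = T[big]
    out[sampled] = T[sampled] / p
    return out
```

The same code works for any tensor order, since all the masking is element-wise; the analysis in the paper is about bounding the spectral norm of the resulting random perturbation.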

Survey of approaches to generate realistic synthetic graphs

SH Lim, S Lee, SS Powers, M Shankar, N Imam - 2016 - osti.gov
A graph is a flexible data structure that can represent relationships between entities. As with
other data analysis tasks, the use of realistic graphs is critical to obtaining valid research …

Provably correct algorithms for matrix column subset selection with selectively sampled data

Y Wang, A Singh - Journal of Machine Learning Research, 2018 - jmlr.org
We consider the problem of matrix column subset selection, which selects a subset of
columns from an input matrix such that the input can be well approximated by the span of the …
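A classical randomized baseline for this problem samples columns with probability proportional to their squared norms and measures how well their span captures the matrix; the paper's selectively-sampled algorithms are more refined, so the sketch below is only a point of comparison:

```python
import numpy as np

def sample_columns(A, k, seed=0):
    # Norm-squared (length-squared) column sampling: pick k distinct
    # columns with probability proportional to their squared 2-norms.
    rng = np.random.default_rng(seed)
    probs = (A**2).sum(axis=0)
    probs /= probs.sum()
    return np.sort(rng.choice(A.shape[1], size=k, replace=False, p=probs))

def projection_error(A, idx):
    # Frobenius-norm error of projecting A onto the span of the columns.
    Q, _ = np.linalg.qr(A[:, idx])
    return np.linalg.norm(A - Q @ (Q.T @ A), 'fro')
```

Selecting all columns drives the projection error to zero for a full-rank matrix; the interesting question, which the paper addresses with selective sampling, is how small the error can be made with k ≪ d columns.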