Randomized numerical linear algebra: Foundations and algorithms

PG Martinsson, JA Tropp - Acta Numerica, 2020 - cambridge.org
This survey describes probabilistic algorithms for linear algebraic computations, such as
factorizing matrices and solving linear systems. It focuses on techniques that have a proven …
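The snippet cuts off, but the survey's central primitive is the randomized range finder: sketch the range of a matrix with a random test matrix, then compute a small deterministic factorization. A minimal NumPy sketch of that recipe (matrix sizes, rank, and oversampling value here are illustrative, not from the survey):

```python
import numpy as np

def randomized_svd(A, k, p=5, seed=None):
    """Basic randomized SVD via a Gaussian range sketch:
    sample, orthonormalize, then factor the small projected matrix."""
    rng = np.random.default_rng(seed)
    Omega = rng.standard_normal((A.shape[1], k + p))  # random test matrix
    Q, _ = np.linalg.qr(A @ Omega)                    # orthonormal basis for sketched range
    Uhat, s, Vt = np.linalg.svd(Q.T @ A, full_matrices=False)
    return (Q @ Uhat)[:, :k], s[:k], Vt[:k]           # rank-k factors

# usage: recover an exactly rank-10 matrix to near machine precision
rng = np.random.default_rng(0)
A = rng.standard_normal((200, 10)) @ rng.standard_normal((10, 150))
U, s, Vt = randomized_svd(A, k=10, seed=1)
rel_err = np.linalg.norm(A - (U * s) @ Vt) / np.linalg.norm(A)
```

On an exactly low-rank input the Gaussian sketch captures the range almost surely, so reconstruction is accurate to roundoff; for general matrices the error tracks the decay of the trailing singular values.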

Gradients without backpropagation

AG Baydin, BA Pearlmutter, D Syme, F Wood… - arXiv preprint arXiv …, 2022 - arxiv.org
Using backpropagation to compute gradients of objective functions for optimization has
remained a mainstay of machine learning. Backpropagation, or reverse-mode differentiation …
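The idea behind this abstract is to replace the backward pass with a "forward gradient": a single forward-mode directional derivative (∇f·v) scaled by the random direction v, which gives an unbiased gradient estimate. A toy NumPy sketch, using a hand-written jvp for a quadratic in place of true forward-mode AD (the function and sample count are illustrative):

```python
import numpy as np

def forward_gradient(jvp, theta, rng):
    """One forward-gradient sample: (∇f(θ)·v) v with v ~ N(0, I).
    jvp(theta, v) returns the directional derivative, which a single
    forward-mode pass can compute without storing a backward tape."""
    v = rng.standard_normal(theta.shape)
    return jvp(theta, v) * v              # unbiased estimator of ∇f(θ)

# toy check: f(x) = 0.5 ||x||^2, so ∇f(x) = x and jvp(x, v) = x·v
jvp = lambda x, v: x @ v
rng = np.random.default_rng(0)
x = np.array([1.0, -2.0, 3.0])
est = np.mean([forward_gradient(jvp, x, rng) for _ in range(20000)], axis=0)
```

Averaging many samples recovers the true gradient; a single sample is cheap but noisy, which is the trade-off a forward-only optimizer accepts.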

A Maxwell's equations based deep learning method for time domain electromagnetic simulations

P Zhang, Y Hu, Y Jin, S Deng, X Wu… - IEEE Journal on …, 2021 - ieeexplore.ieee.org
In this paper, we discuss an unsupervised deep learning (DL) method for solving time
domain electromagnetic simulations. Compared to the conventional approach, our method …

Stochastic quasi-gradient methods: Variance reduction via Jacobian sketching

RM Gower, P Richtárik, F Bach - Mathematical Programming, 2021 - Springer
We develop a new family of variance reduced stochastic gradient descent methods for
minimizing the average of a very large number of smooth functions. Our method—JacSketch …
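JacSketch recovers several known methods as special cases; the abstract doesn't show one, but SAGA is the instance in which each step samples a single column of the Jacobian of (f_1, …, f_m) and refreshes the stored copy. A hedged NumPy sketch on a small least-squares problem (sizes, step size, and iteration count are illustrative):

```python
import numpy as np

def saga(grads, x0, alpha, n_iters, rng):
    """SAGA: sample one component gradient per step and combine it with a
    table of remembered gradients to get a variance-reduced estimator."""
    m = len(grads)
    x = x0.copy()
    J = np.array([g(x) for g in grads])    # table of remembered gradients
    for _ in range(n_iters):
        i = rng.integers(m)
        gi = grads[i](x)                   # fresh gradient of one component
        g = gi - J[i] + J.mean(axis=0)     # variance-reduced estimator
        J[i] = gi                          # sketch-style memory update
        x = x - alpha * g
    return x

# least squares: f_i(x) = 0.5 (a_i·x - b_i)^2 on a consistent system
rng = np.random.default_rng(0)
A = rng.standard_normal((30, 5))
x_true = rng.standard_normal(5)
b = A @ x_true
grads = [lambda x, a=A[i], bi=b[i]: (a @ x - bi) * a for i in range(30)]
x = saga(grads, np.zeros(5), alpha=0.01, n_iters=10000, rng=rng)
```

Because the estimator's variance vanishes as the memory table converges, the iterates converge linearly to the exact solution rather than to a noise floor.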

SEGA: Variance reduction via gradient sketching

F Hanzely, K Mishchenko… - Advances in Neural …, 2018 - proceedings.neurips.cc
We propose a novel randomized first order optimization method---SEGA (SkEtched GrAdient
method)---which progressively throughout its iterations builds a variance-reduced estimate …
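A minimal NumPy rendition of SEGA's coordinate-sketch variant may make the idea concrete: only one random partial derivative is revealed per iteration, and a remembered vector h is updated so that the gradient estimator's variance shrinks as the iterates converge (the problem and hyperparameters here are illustrative):

```python
import numpy as np

def sega(grad, x0, alpha, n_iters, rng):
    """SEGA with coordinate sketches: each iteration observes a single
    random partial derivative yet drives a variance-reduced estimate."""
    x, h = x0.copy(), np.zeros_like(x0)    # h: running gradient sketch
    d = x0.size
    for _ in range(n_iters):
        i = rng.integers(d)
        gi = grad(x)[i]                    # the one observed coordinate
        g = h.copy()
        g[i] += d * (gi - h[i])            # unbiased gradient estimator
        h[i] = gi                          # update the sketched memory
        x = x - alpha * g
    return x

# quadratic f(x) = 0.5 ||x - b||^2 with known minimizer b
b = np.array([1.0, -1.0, 2.0])
x = sega(lambda z: z - b, np.zeros(3), alpha=0.05, n_iters=5000,
         rng=np.random.default_rng(0))
```

The scaling by d makes the estimator unbiased; as h approaches the (zero) gradient at the optimum, the estimator's variance vanishes, which is what distinguishes SEGA from naive coordinate-wise SGD.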

Recent and upcoming developments in randomized numerical linear algebra for machine learning

M Dereziński, MW Mahoney - Proceedings of the 30th ACM SIGKDD …, 2024 - dl.acm.org
Large matrices arise in many machine learning and data analysis applications, including as
representations of datasets, graphs, model weights, and first and second-order derivatives …

Stochastic reformulations of linear systems: algorithms and convergence theory

P Richtárik, M Takáč - SIAM Journal on Matrix Analysis and Applications, 2020 - SIAM
We develop a family of reformulations of an arbitrary consistent linear system into a
stochastic problem. The reformulations are governed by two user-defined parameters: a …
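The basic method attached to these reformulations is sketch-and-project; with single-row sketches it reduces to randomized Kaczmarz. A small NumPy sketch of that special case, sampling rows proportionally to their squared norms (the system here is an illustrative consistent one):

```python
import numpy as np

def randomized_kaczmarz(A, b, n_iters, rng):
    """Sketch-and-project with single-row sketches = randomized Kaczmarz:
    each step projects the iterate onto one randomly chosen equation."""
    x = np.zeros(A.shape[1])
    row_sq = (A * A).sum(axis=1)
    p = row_sq / row_sq.sum()                       # sample rows ∝ ||a_i||^2
    for _ in range(n_iters):
        i = rng.choice(A.shape[0], p=p)
        x += (b[i] - A[i] @ x) / row_sq[i] * A[i]   # project onto {a_i·x = b_i}
    return x

# consistent system: the iterates converge to the solution
rng = np.random.default_rng(0)
A = rng.standard_normal((50, 10))
x_true = rng.standard_normal(10)
x = randomized_kaczmarz(A, A @ x_true, n_iters=3000, rng=rng)
```

For a consistent system the expected squared error contracts at a linear rate governed by the smallest singular value of A relative to its Frobenius norm.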

Stochastic subspace cubic Newton method

F Hanzely, N Doikov, Y Nesterov… - … on Machine Learning, 2020 - proceedings.mlr.press
In this paper, we propose a new randomized second-order optimization algorithm—
Stochastic Subspace Cubic Newton (SSCN)—for minimizing a high dimensional convex …
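For a purely quadratic objective the subspace second-order model is exact, so a stripped-down sketch of the subspace Newton step shows the mechanic: minimize exactly over a random coordinate block per iteration (SSCN's cubic regularizer is omitted here, and the dimensions are illustrative):

```python
import numpy as np

def subspace_newton(H, b, tau, n_iters, rng):
    """Random-subspace Newton on f(x) = 0.5 xᵀHx - bᵀx: each step
    minimizes f exactly over a random tau-dimensional coordinate block."""
    x = np.zeros(H.shape[0])
    for _ in range(n_iters):
        S = rng.choice(H.shape[0], size=tau, replace=False)
        g_S = H[S] @ x - b[S]                          # gradient restricted to S
        x[S] -= np.linalg.solve(H[np.ix_(S, S)], g_S)  # exact subspace step
    return x

rng = np.random.default_rng(0)
M = rng.standard_normal((20, 8))
H = M.T @ M + np.eye(8)                # well-conditioned SPD Hessian
x_true = rng.standard_normal(8)
x = subspace_newton(H, H @ x_true, tau=3, n_iters=5000, rng=rng)
```

Each iteration only touches a tau × tau block of the Hessian, which is the point: second-order progress at a per-step cost far below a full Newton solve.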

Fast and furious convergence: Stochastic second order methods under interpolation

SY Meng, S Vaswani, IH Laradji… - International …, 2020 - proceedings.mlr.press
We consider stochastic second-order methods for minimizing smooth and strongly-convex
functions under an interpolation condition satisfied by over-parameterized models. Under …
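Under interpolation a consistent least-squares problem is the simplest testbed: every mini-batch shares the global minimizer, so a subsampled Newton step (batch-Hessian pseudo-inverse applied to the batch gradient) projects onto that batch's solution set and the iterates converge linearly. A toy NumPy sketch of this mechanism (batch size and dimensions illustrative, not the paper's method verbatim):

```python
import numpy as np

def subsampled_newton(A, b, batch, n_iters, rng):
    """Subsampled Newton on 0.5||Ax - b||^2 under interpolation (b = A x*):
    each step applies the pseudo-inverse of a mini-batch Hessian to the
    mini-batch gradient, projecting onto that batch's solution set."""
    x = np.zeros(A.shape[1])
    for _ in range(n_iters):
        S = rng.choice(A.shape[0], size=batch, replace=False)
        As, bs = A[S], b[S]
        g = As.T @ (As @ x - bs)               # mini-batch gradient
        x = x - np.linalg.pinv(As.T @ As) @ g  # Newton step with batch Hessian
    return x

rng = np.random.default_rng(0)
A = rng.standard_normal((40, 5))
x_true = rng.standard_normal(5)
x = subsampled_newton(A, A @ x_true, batch=3, n_iters=400, rng=rng)
```

Interpolation is what removes the noise floor: since every batch's gradient vanishes at the common minimizer, no step-size decay is needed for exact convergence.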

Accelerated decentralized optimization with local updates for smooth and strongly convex objectives

H Hendrikx, F Bach… - The 22nd International …, 2019 - proceedings.mlr.press
In this paper, we study the problem of minimizing a sum of smooth and strongly convex
functions split over the nodes of a network in a decentralized fashion. We propose the …
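As a baseline for this setting (plain decentralized gradient descent, not the paper's accelerated local-update method), each iteration alternates one gossip round with one local gradient step. A toy NumPy sketch on a 5-node ring with scalar quadratic losses, where the global optimum is the mean of the local targets:

```python
import numpy as np

# Decentralized gradient descent on a 5-node ring: node i holds
# f_i(x) = 0.5 (x - c_i)^2 and communicates only with its two neighbors.
n, alpha, T = 5, 0.005, 8000
P = np.roll(np.eye(n), 1, axis=0)
W = 0.5 * np.eye(n) + 0.25 * (P + P.T)   # doubly stochastic gossip matrix
c = np.array([0.0, 1.0, 2.0, 3.0, 4.0])  # local targets; global optimum = mean(c)
x = np.zeros(n)                          # one scalar iterate per node
for _ in range(T):
    x = W @ x - alpha * (x - c)          # gossip round + local gradient step
```

With a constant step size the nodes agree only up to an O(alpha) consensus error around the global optimum; closing that gap (and reducing communication) is exactly what gradient tracking and accelerated local-update schemes target.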