A survey on deep matrix factorizations
Constrained low-rank matrix approximations have been known for decades as powerful
linear dimensionality reduction techniques able to extract the information contained in large …
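As a point of reference for the low-rank approximations the snippet refers to, here is a minimal numpy sketch (not from the survey itself) of the unconstrained case: the best rank-r approximation of a matrix via truncated SVD, the closed-form baseline that constrained and deep factorizations generalize.

    import numpy as np

    def best_rank_r(X, r):
        # Truncated SVD gives the best rank-r approximation in
        # Frobenius norm (Eckart-Young theorem).
        U, s, Vt = np.linalg.svd(X, full_matrices=False)
        return (U[:, :r] * s[:r]) @ Vt[:r, :]

    rng = np.random.default_rng(0)
    X = rng.standard_normal((100, 50))
    X5 = best_rank_r(X, r=5)
    print(np.linalg.norm(X - X5) / np.linalg.norm(X))  # relative error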
Data-driven cardiovascular flow modelling: examples and opportunities
High-fidelity blood flow modelling is crucial for enhancing our understanding of
cardiovascular disease. Despite significant advances in computational and experimental …
Scatterbrain: Unifying sparse and low-rank attention
Recent advances in efficient Transformers have exploited either the sparsity or low-rank
properties of attention matrices to reduce the computational and memory bottlenecks of …
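The sparse-plus-low-rank structure of attention matrices can be illustrated with a small numpy sketch. This is not Scatterbrain's actual estimator (the paper combines kernel-based low-rank features with hashing-based sparsity); it only shows the decomposition A ≈ L + S that the method unifies.

    import numpy as np

    def softmax(Z, axis=-1):
        Z = Z - Z.max(axis=axis, keepdims=True)
        E = np.exp(Z)
        return E / E.sum(axis=axis, keepdims=True)

    rng = np.random.default_rng(0)
    n, d = 128, 16
    Q, K = rng.standard_normal((n, d)), rng.standard_normal((n, d))
    A = softmax(Q @ K.T / np.sqrt(d))              # dense attention matrix

    # Low-rank part: best rank-r approximation of A.
    U, s, Vt = np.linalg.svd(A)
    r = 8
    L = (U[:, :r] * s[:r]) @ Vt[:r, :]

    # Sparse part: keep only the k largest-magnitude residual entries.
    R = A - L
    k = 4 * n
    idx = np.argsort(np.abs(R), axis=None)[-k:]
    S = np.zeros_like(A)
    S.flat[idx] = R.flat[idx]

    err = np.linalg.norm(A - (L + S)) / np.linalg.norm(A)
    print(f"relative error of sparse + low-rank fit: {err:.3f}")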
[BOOK][B] Numerical linear algebra
LN Trefethen, D Bau - 2022 - SIAM
Since the early 1980s, the first author has taught a graduate course in numerical linear
algebra at MIT and Cornell. The alumni of this course, now numbering in the hundreds, have …
[BOOK][B] Numerical methods for least squares problems
Å Björck - 2024 - SIAM
More than 25 years have passed since the first edition of this book was published in
1996. Least squares and least-norm problems have become more significant with every …
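For reference, the core problem the book treats, minimize ||Ax − b||_2, in a short numpy example (an illustrative baseline, not an excerpt from the book); the SVD-based lstsq routine is a numerically stable alternative to forming the normal equations.

    import numpy as np

    rng = np.random.default_rng(0)
    A = rng.standard_normal((200, 5))
    x_true = np.arange(1.0, 6.0)
    b = A @ x_true + 0.01 * rng.standard_normal(200)

    # SVD-based solver for min ||Ax - b||_2; numerically safer than
    # solving A.T @ A @ x = A.T @ b directly.
    x, res, rank, sv = np.linalg.lstsq(A, b, rcond=None)
    print(x)  # close to x_true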
Pixelated butterfly: Simple and efficient sparse training for neural network models
Overparameterized neural networks generalize well but are expensive to train. Ideally, one
would like to reduce their computational cost while retaining their generalization benefits …
The why and how of nonnegative matrix factorization
N Gillis - … , optimization, kernels, and support vector machines, 2014 - books.google.com
Nonnegative matrix factorization (NMF) has become a widely used tool for the analysis of
high-dimensional data as it automatically extracts sparse and meaningful features from a set …
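A minimal sketch of one standard NMF algorithm, the Lee-Seung multiplicative updates, in numpy (an illustrative baseline, not necessarily the algorithms advocated in the chapter): X ≈ WH with both factors kept elementwise nonnegative, which is what yields the sparse, parts-based features mentioned.

    import numpy as np

    def nmf(X, r, n_iter=200, eps=1e-10):
        # Lee-Seung multiplicative updates for min ||X - WH||_F
        # subject to W >= 0 and H >= 0.
        rng = np.random.default_rng(0)
        m, n = X.shape
        W = rng.random((m, r))
        H = rng.random((r, n))
        for _ in range(n_iter):
            H *= (W.T @ X) / (W.T @ W @ H + eps)
            W *= (X @ H.T) / (W @ H @ H.T + eps)
        return W, H

    X = np.abs(np.random.default_rng(1).standard_normal((50, 40)))
    W, H = nmf(X, r=5)
    print(np.linalg.norm(X - W @ H) / np.linalg.norm(X))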
JFB: Jacobian-free backpropagation for implicit networks
A promising trend in deep learning replaces traditional feedforward networks with implicit
networks. Unlike traditional networks, implicit networks solve a fixed point equation to …
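The fixed-point formulation admits a compact sketch of the Jacobian-free idea, here in PyTorch with a toy layer (the layer is an assumption for illustration, not the paper's architecture): solve z* = f(z*, x) without building a graph, then backpropagate through a single extra application of f.

    import torch

    class ImplicitLayer(torch.nn.Module):
        # Toy implicit layer: the output is a fixed point z* = tanh(Wz* + Ux).
        # For the iteration to converge, f should be a contraction in z.
        def __init__(self, dim):
            super().__init__()
            self.W = torch.nn.Linear(dim, dim, bias=False)
            self.U = torch.nn.Linear(dim, dim)

        def f(self, z, x):
            return torch.tanh(self.W(z) + self.U(x))

        def forward(self, x, n_iter=50):
            z = torch.zeros_like(x)
            with torch.no_grad():            # fixed-point solve, no graph
                for _ in range(n_iter):
                    z = self.f(z, x)
            return self.f(z, x)              # one tracked step: JFB gradient

    layer = ImplicitLayer(8)
    x = torch.randn(4, 8)
    loss = layer(x).pow(2).sum()
    loss.backward()                          # backprop touches f only once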
Missing data imputation using optimal transport
Missing data is a crucial issue when applying machine learning algorithms to real-world
datasets. Starting from the simple assumption that two batches extracted randomly from the …
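That assumption can be turned directly into a training signal. Below is a simplified PyTorch sketch in that spirit (not the paper's exact algorithm): missing entries are trainable parameters, and an entropic-OT (Sinkhorn) cost between two random batches is minimized by gradient descent.

    import torch

    def sinkhorn_cost(x, y, eps=0.1, n_iter=50):
        # Entropic OT between two point clouds with uniform weights.
        C = torch.cdist(x, y) ** 2
        C = C / C.detach().max()            # rescale costs for stability
        K = torch.exp(-C / eps)
        u = torch.full((x.shape[0],), 1.0 / x.shape[0])
        v = torch.full((y.shape[0],), 1.0 / y.shape[0])
        a, b = u.clone(), v.clone()
        for _ in range(n_iter):             # Sinkhorn fixed-point iterations
            a = u / (K @ b)
            b = v / (K.T @ a)
        P = a[:, None] * K * b[None, :]     # transport plan
        return (P * C).sum()

    g = torch.Generator().manual_seed(0)
    X = torch.randn(200, 5, generator=g)
    mask = torch.rand(200, 5, generator=g) < 0.2       # 20% missing
    fill = torch.zeros(int(mask.sum()), requires_grad=True)
    opt = torch.optim.Adam([fill], lr=0.05)

    for _ in range(300):
        Xi = X.masked_scatter(mask, fill)              # plug in imputations
        i = torch.randperm(200)[:64]                   # two random batches
        j = torch.randperm(200)[:64]
        loss = sinkhorn_cost(Xi[i], Xi[j])             # batches should match
        opt.zero_grad()
        loss.backward()
        opt.step()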
Desiderata for representation learning: A causal perspective
Representation learning constructs low-dimensional representations to summarize essential
features of high-dimensional data. This learning problem is often approached by describing …