Approximation and sampling of multivariate probability distributions in the tensor train decomposition

S Dolgov, K Anaya-Izquierdo, C Fox… - Statistics and Computing, 2020 - Springer
General multivariate distributions are notoriously expensive to sample from, particularly the
high-dimensional posterior distributions in PDE-constrained inverse problems. This paper …
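For orientation, here is a minimal NumPy sketch (not the authors' code) of what the tensor-train format itself looks like: a d-dimensional array is stored as a chain of small three-way cores, and a single entry is recovered by multiplying one matrix slice per core. The dimensions and ranks below are arbitrary illustrative choices.

```python
import numpy as np

# Tensor-train (TT) format in miniature: a d-dimensional tensor is stored as
# cores G_k of shape (r_{k-1}, n_k, r_k) with r_0 = r_d = 1.  Shapes and ranks
# here are arbitrary choices for the sketch.
d, n, r = 4, 5, 3
cores = [np.random.rand(1 if k == 0 else r, n, 1 if k == d - 1 else r)
         for k in range(d)]

def tt_entry(cores, index):
    """Evaluate the TT tensor at a multi-index by chaining one slice per core."""
    v = np.ones((1, 1))
    for G, i in zip(cores, index):
        v = v @ G[:, i, :]
    return v[0, 0]

print(tt_entry(cores, (0, 1, 2, 3)))
```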

Deep importance sampling using tensor trains with application to a priori and a posteriori rare events

T Cui, S Dolgov, R Scheichl - SIAM Journal on Scientific Computing, 2024 - SIAM
We propose a deep importance sampling method that is suitable for estimating rare event
probabilities in high-dimensional problems. We approximate the optimal importance …
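As a generic reminder of the importance-sampling idea the paper builds on (this sketch uses a hand-picked Gaussian biasing density, not the TT-driven construction of the paper), consider estimating the small tail probability P(X > a) for a standard normal X:

```python
import numpy as np
from scipy.stats import norm

# Importance sampling for a rare event P(X > a), X ~ N(0, 1): sample from a
# biasing density q = N(a, 1) centred at the threshold and reweight by the
# likelihood ratio p/q.  The biasing density is an ad-hoc choice for this sketch.
rng = np.random.default_rng(0)
a, n = 4.0, 10_000

x = rng.normal(loc=a, scale=1.0, size=n)        # samples from q
weights = norm.pdf(x) / norm.pdf(x, loc=a)      # likelihood ratio p(x)/q(x)
estimate = np.mean((x > a) * weights)

print(estimate, 1.0 - norm.cdf(a))              # IS estimate vs exact tail probability
```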

Scalable conditional deep inverse Rosenblatt transports using tensor trains and gradient-based dimension reduction

T Cui, S Dolgov, O Zahm - Journal of Computational Physics, 2023 - Elsevier
We present a novel offline-online method to mitigate the computational burden of the
characterization of posterior random variables in statistical learning. In the offline phase, the …
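An inverse Rosenblatt transport maps uniform random variables to samples of a target density through a sequence of conditional inverse CDFs. The grid-based 2D sketch below illustrates that mechanism only; the target density and resolution are assumptions, while the paper's contribution is the scalable, TT-based, dimension-reduced version of such maps.

```python
import numpy as np

# Inverse Rosenblatt transport on a 2D grid: map (u1, u2) in (0,1)^2 to a
# sample of the target by inverting the marginal CDF in x1 and then the
# conditional CDF of x2 given x1.  Target density and grid are illustrative.
grid = np.linspace(-4.0, 4.0, 401)
X1, X2 = np.meshgrid(grid, grid, indexing="ij")
rho = 0.8
density = np.exp(-0.5 * (X1**2 - 2 * rho * X1 * X2 + X2**2) / (1 - rho**2))
density /= density.sum()

def inverse_rosenblatt(u1, u2):
    p1 = density.sum(axis=1)                           # marginal of x1
    i1 = np.searchsorted(np.cumsum(p1) / p1.sum(), u1)
    p2 = density[i1, :]                                # conditional of x2 given x1
    i2 = np.searchsorted(np.cumsum(p2) / p2.sum(), u2)
    return grid[i1], grid[i2]

print(inverse_rosenblatt(0.3, 0.9))
```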

Rank bounds for approximating Gaussian densities in the tensor-train format

PB Rohrbach, S Dolgov, L Grasedyck… - SIAM/ASA Journal on …, 2022 - SIAM
Low-rank tensor approximations have shown great potential for uncertainty quantification in
high dimensions, for example, to build surrogate models that can be used to speed up large …
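A two-dimensional glimpse of the low-rank structure the paper quantifies (the rank bounds themselves concern the TT format in general dimension; the grid and correlation below are arbitrary): discretise a correlated Gaussian density and observe how quickly its singular values decay.

```python
import numpy as np

# Discretise a correlated 2D Gaussian density on a grid and check its numerical
# rank; the fast singular-value decay is the 2D analogue of the low TT ranks
# discussed in the paper.  Grid size, correlation, and tolerance are arbitrary.
grid = np.linspace(-5.0, 5.0, 200)
X, Y = np.meshgrid(grid, grid, indexing="ij")
rho = 0.9
density = np.exp(-0.5 * (X**2 - 2 * rho * X * Y + Y**2) / (1 - rho**2))

s = np.linalg.svd(density, compute_uv=False)
numerical_rank = int(np.sum(s > 1e-8 * s[0]))
print(numerical_rank)    # far smaller than the grid size of 200
```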

Adaptive stochastic Galerkin FEM for lognormal coefficients in hierarchical tensor representations

M Eigel, M Marschall, M Pfeffer, R Schneider - Numerische Mathematik, 2020 - Springer
Stochastic Galerkin methods for non-affine coefficient representations are known to cause
major difficulties from theoretical and numerical points of view. In this work, an adaptive …

Generalized self-concordant analysis of Frank–Wolfe algorithms

P Dvurechensky, K Safin, S Shtern… - Mathematical Programming, 2023 - Springer
Projection-free optimization via different variants of the Frank–Wolfe method has become
one of the cornerstones of large scale optimization for machine learning and computational …
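For reference, the textbook Frank–Wolfe iteration (not the generalized self-concordant setting analysed in the paper): minimise a smooth convex objective over the probability simplex, where the linear minimisation oracle simply returns the vertex with the smallest gradient entry. The problem data and the 2/(k+2) step size are standard illustrative choices.

```python
import numpy as np

# Frank-Wolfe (conditional gradient) for min 0.5*||Ax - b||^2 over the simplex.
rng = np.random.default_rng(1)
A = rng.standard_normal((30, 10))
b = rng.standard_normal(30)

x = np.ones(10) / 10                      # start at the barycentre of the simplex
for k in range(200):
    g = A.T @ (A @ x - b)                 # gradient of the objective
    s = np.zeros(10)
    s[np.argmin(g)] = 1.0                 # linear minimisation oracle: a simplex vertex
    gamma = 2.0 / (k + 2.0)               # standard open-loop step size
    x = (1.0 - gamma) * x + gamma * s     # convex combination keeps x feasible

print(0.5 * np.linalg.norm(A @ x - b) ** 2)
```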

Low-rank tensor reconstruction of concentrated densities with application to Bayesian inversion

M Eigel, R Gruhlke, M Marschall - Statistics and Computing, 2022 - Springer
This paper presents a novel method for the accurate functional approximation of possibly
highly concentrated probability densities. It is based on the combination of several modern …

Multilevel adaptive sparse Leja approximations for Bayesian inverse problems

IG Farcas, J Latz, E Ullmann, T Neckel… - SIAM Journal on Scientific …, 2020 - SIAM
Deterministic interpolation and quadrature methods are often unsuitable for addressing
Bayesian inverse problems that depend on computationally expensive forward mathematical …

Bayesian inversion for electromyography using low-rank tensor formats

A Rörich, TA Werthmann, D Göddeke… - Inverse …, 2021 - iopscience.iop.org
The reconstruction of the structure of biological tissue using electromyographic (EMG) data
is a non-invasive imaging method with diverse medical applications. Mathematically, this …

Learning high-dimensional probability distributions using tree tensor networks

E Grelier, A Nouy, R Lebrun - arXiv preprint arXiv:1912.07913, 2019 - arxiv.org
We consider the problem of estimating a high-dimensional probability distribution from i.i.d.
samples of the distribution using model classes of functions in tree-based tensor formats …
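A degenerate special case, to make the estimation setting concrete (not the tree tensor networks of the paper): for a rank-one, fully factorised model of a discrete 2D distribution, the maximum-likelihood fit from i.i.d. samples is just the outer product of the empirical marginals. The synthetic data below are an assumption of the sketch.

```python
import numpy as np

# Rank-one density estimation from i.i.d. samples: the ML estimate of a fully
# factorised pmf is the outer product of the empirical marginals.  Tree tensor
# networks generalise this to higher ranks and many variables.
rng = np.random.default_rng(2)
samples = rng.integers(0, 5, size=(1000, 2))          # synthetic pairs in {0,...,4}^2

marg1 = np.bincount(samples[:, 0], minlength=5) / len(samples)
marg2 = np.bincount(samples[:, 1], minlength=5) / len(samples)
joint_estimate = np.outer(marg1, marg2)               # estimated joint pmf

print(joint_estimate.sum())                           # normalised to 1
```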