Sampling via gradient flows in the space of probability measures

Y Chen, DZ Huang, J Huang, S Reich… - arXiv preprint arXiv …, 2023 - arxiv.org
Sampling a target probability distribution with an unknown normalization constant is a
fundamental challenge in computational science and engineering. Recent work shows that …
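
A minimal concrete instance of this line of work (not this paper's specific scheme) is the unadjusted Langevin algorithm, the Euler-Maruyama discretization of the Langevin diffusion whose law follows the Wasserstein gradient flow of KL(·||π); it needs only ∇ log π, so the unknown normalization constant never enters. A Python sketch under those assumptions:

    import numpy as np

    def ula_sample(grad_log_target, x0, step=1e-2, n_steps=5000, rng=None):
        """Unadjusted Langevin algorithm: Euler-Maruyama steps of
        dX_t = grad log pi(X_t) dt + sqrt(2) dW_t, whose marginal law follows
        the Wasserstein gradient flow of KL(. || pi).  Only grad log pi is
        needed, so the normalization constant of pi is irrelevant."""
        rng = np.random.default_rng() if rng is None else rng
        x = np.asarray(x0, dtype=float).copy()
        for _ in range(n_steps):
            x = x + step * grad_log_target(x) + np.sqrt(2.0 * step) * rng.standard_normal(x.shape)
        return x

    # Example: standard Gaussian target, for which grad log pi(x) = -x.
    samples = np.stack([ula_sample(lambda x: -x, np.zeros(2)) for _ in range(100)])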

Gradient flows for sampling: mean-field models, Gaussian approximations and affine invariance

Y Chen, DZ Huang, J Huang, S Reich… - arXiv preprint arXiv …, 2023 - arxiv.org
Sampling a probability distribution with an unknown normalization constant is a fundamental
problem in computational science and engineering. This task may be cast as an optimization …
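
For context, the optimization formulation alluded to here is the standard one: with π ∝ e^{-V}, sampling is recast as minimizing ρ ↦ KL(ρ||π) over probability measures, and the Wasserstein gradient flow of that objective is the Fokker-Planck equation (a textbook identity rather than anything specific to this paper):

    \min_{\rho}\ \mathrm{KL}(\rho \,\|\, \pi),
    \qquad
    \partial_t \rho_t \;=\; \nabla \cdot \big( \rho_t \, \nabla \log(\rho_t / \pi) \big)
    \;=\; \Delta \rho_t + \nabla \cdot ( \rho_t \nabla V ),

where only ∇V appears, so the normalization constant of π plays no role.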

Mirror and preconditioned gradient descent in Wasserstein space

C Bonet, T Uscidda, A David… - arXiv preprint arXiv …, 2024 - arxiv.org
As the problem of minimizing functionals on the Wasserstein space encompasses many
applications in machine learning, different optimization algorithms on $\mathbb{R}^d$ …
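
As a generic illustration of the preconditioned case only (a sketch assuming a fixed SPD preconditioner A, not the mirror-descent scheme studied in the paper): for constant A, the diffusion dX = -A ∇V(X) dt + sqrt(2) A^{1/2} dW keeps π ∝ e^{-V} invariant, and its Euler-Maruyama step is a preconditioned Langevin / Wasserstein gradient update:

    import numpy as np

    def preconditioned_ula(grad_V, A, x0, step=1e-2, n_steps=5000, rng=None):
        """Preconditioned unadjusted Langevin with a fixed SPD matrix A:
        x <- x - step * A grad V(x) + sqrt(2 step) * A^{1/2} xi.
        For constant A the invariant density is still proportional to exp(-V)."""
        rng = np.random.default_rng() if rng is None else rng
        L = np.linalg.cholesky(A)          # L @ L.T = A, so L @ xi has covariance A
        x = np.asarray(x0, dtype=float).copy()
        for _ in range(n_steps):
            noise = L @ rng.standard_normal(x.shape)
            x = x - step * A @ grad_V(x) + np.sqrt(2.0 * step) * noise
        return x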

Constrained consensus-based optimization and numerical heuristics for the few particle regime

J Beddrich, E Chenchene, M Fornasier… - arXiv preprint arXiv …, 2024 - arxiv.org
Consensus-based optimization (CBO) is a versatile multi-particle optimization method for
performing nonconvex and nonsmooth global optimizations in high dimensions. Proofs of …
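
For reference, one step of the plain, unconstrained CBO dynamics that this work builds on; the paper's constraint handling and few-particle heuristics are not shown, and the parameters below are illustrative:

    import numpy as np

    def cbo_step(X, f, dt=0.01, lam=1.0, sigma=1.0, alpha=30.0, rng=None):
        """One explicit Euler step of consensus-based optimization with
        anisotropic noise.  X has shape (n_particles, d); f is vectorized,
        returning shape (n_particles,)."""
        rng = np.random.default_rng() if rng is None else rng
        vals = f(X)
        w = np.exp(-alpha * (vals - vals.min()))        # Laplace weights, shifted for numerical stability
        x_bar = (w[:, None] * X).sum(axis=0) / w.sum()  # weighted consensus point
        noise = rng.standard_normal(X.shape)
        return X - lam * dt * (X - x_bar) + sigma * np.sqrt(dt) * (X - x_bar) * noise

    # Example: minimize f(x) = ||x||^2 with 50 particles in dimension 10.
    X = np.random.default_rng(0).standard_normal((50, 10))
    for _ in range(200):
        X = cbo_step(X, lambda X: np.sum(X**2, axis=1))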

Information geometric regularization of the barotropic Euler equation

R Cao, F Schäfer - arXiv preprint arXiv:2308.14127, 2023 - arxiv.org
A key numerical difficulty in compressible fluid dynamics is the formation of shock waves.
Shock waves feature jump discontinuities in the velocity and density of the fluid and thus …
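
For background, the barotropic Euler system referred to here is the compressible Euler equations closed by a pressure law depending on density alone (e.g. p(ρ) = κ ρ^γ), which is the setting in which the shock formation described above occurs:

    \partial_t \rho + \nabla \cdot (\rho u) = 0,
    \qquad
    \partial_t (\rho u) + \nabla \cdot (\rho u \otimes u) + \nabla p(\rho) = 0,
    \qquad p = p(\rho).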

A Unified Perspective on the Dynamics of Deep Transformers

V Castin, P Ablin, JA Carrillo, G Peyré - arXiv preprint arXiv:2501.18322, 2025 - arxiv.org
Transformers, which are state-of-the-art in most machine learning tasks, represent the data
as sequences of vectors called tokens. This representation is then exploited by the attention …
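
The dynamics in question is commonly modeled by viewing tokens as interacting particles driven by attention; below is a minimal Euler-step sketch of that generic ODE, where the matrices Q, K, V are assumed inputs and nothing here is the paper's specific parametrization:

    import numpy as np

    def softmax(A, axis=-1):
        A = A - A.max(axis=axis, keepdims=True)
        E = np.exp(A)
        return E / E.sum(axis=axis, keepdims=True)

    def attention_dynamics(X, Q, K, V, step=0.1, n_steps=50):
        """Euler integration of dx_i/dt = sum_j softmax_j((Q x_i).(K x_j)) V x_j,
        the interacting-particle ODE often used to model stacked attention layers.
        X has shape (n_tokens, d); Q, K, V have shape (d, d)."""
        X = X.copy()
        for _ in range(n_steps):
            logits = (X @ Q.T) @ (X @ K.T).T   # pairwise attention scores, shape (n, n)
            P = softmax(logits, axis=1)        # row-stochastic attention matrix
            X = X + step * P @ (X @ V.T)       # attention-weighted drift on each token
        return X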

Stable Derivative Free Gaussian Mixture Variational Inference for Bayesian Inverse Problems

B Che, Y Chen, Z Huan, DZ Huang, W Wang - arXiv preprint arXiv …, 2025 - arxiv.org
This paper is concerned with the approximation of probability distributions known up to
normalization constants, with a focus on Bayesian inference for large-scale inverse …
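
One derivative-free ingredient commonly used in such Gaussian(-mixture) schemes, sketched generically here rather than as this paper's algorithm, is estimating the expected gradient of the negative log-density V under a Gaussian component from function values alone, via Stein's identity E[(X - m) V(X)] = C E[∇V(X)] for X ~ N(m, C):

    import numpy as np

    def expected_grad_derivative_free(V, m, C, n_samples=1000, rng=None):
        """Monte Carlo estimate of E[grad V(X)] for X ~ N(m, C) using only
        evaluations of V, via Stein's identity E[(X - m) V(X)] = C E[grad V(X)]."""
        rng = np.random.default_rng() if rng is None else rng
        X = rng.multivariate_normal(m, C, size=n_samples)    # (n_samples, d)
        vals = np.array([V(x) for x in X])                   # (n_samples,)
        vals = vals - vals.mean()                            # control variate: E[(X - m) c] = 0
        rhs = ((X - m) * vals[:, None]).mean(axis=0)         # ~ E[(X - m) V(X)]
        return np.linalg.solve(C, rhs)                       # ~ E[grad V(X)]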

Gradient Flows and Riemannian Structure in the Gromov-Wasserstein Geometry

Z Zhang, Z Goldfeld, K Greenewald, Y Mroueh… - arXiv preprint arXiv …, 2024 - arxiv.org
The Wasserstein space of probability measures is known for its intricate Riemannian
structure, which underpins the Wasserstein geometry and enables gradient flow algorithms …
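
For reference, the Gromov-Wasserstein distance between metric measure spaces (X, d_X, μ) and (Y, d_Y, ν) that defines the geometry studied here is

    \mathrm{GW}_2(\mu,\nu)^2
    \;=\;
    \min_{\pi \in \Pi(\mu,\nu)}
    \iint \big| d_X(x,x') - d_Y(y,y') \big|^2 \, \mathrm{d}\pi(x,y)\, \mathrm{d}\pi(x',y'),

the minimum being over couplings of μ and ν; the paper's subject is the Riemannian structure and gradient flows associated with this distance.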