Accelerated gradient methods for geodesically convex optimization: Tractable algorithms and convergence analysis

J Kim, I Yang - International Conference on Machine …, 2022 - proceedings.mlr.press
We propose computationally tractable accelerated first-order methods for Riemannian
optimization, extending the Nesterov accelerated gradient (NAG) method. For both …

Dynamical systems–based neural networks

E Celledoni, D Murari, B Owren, CB Schönlieb… - SIAM Journal on …, 2023 - SIAM
Neural networks have gained much interest because of their effectiveness in many
applications. However, their mathematical properties are generally not well understood. If …

Geometric methods for sampling, optimization, inference, and adaptive agents

A Barp, L Da Costa, G França, K Friston, M Girolami… - Handbook of …, 2022 - Elsevier
In this chapter, we identify fundamental geometric structures that underlie the problems of
sampling, optimization, inference, and adaptive decision-making. Based on this …

Negative curvature obstructs acceleration for strongly geodesically convex optimization, even with exact first-order oracles

C Criscitiello, N Boumal - Conference on Learning Theory, 2022 - proceedings.mlr.press
Hamilton and Moitra (2021) showed that, in certain regimes, it is not possible to accelerate
Riemannian gradient descent in the hyperbolic plane if we restrict ourselves to algorithms …

Negative curvature obstructs acceleration for strongly geodesically convex optimization, even with exact first-order oracles

C Criscitiello, N Boumal - arXiv preprint arXiv:2111.13263, 2021 - arxiv.org
Hamilton and Moitra (2021) showed that, in certain regimes, it is not possible to accelerate
Riemannian gradient descent in the hyperbolic plane if we restrict ourselves to algorithms …

A nonsmooth dynamical systems perspective on accelerated extensions of ADMM

G França, DP Robinson, R Vidal - IEEE Transactions on …, 2023 - ieeexplore.ieee.org
Recently, there has been great interest in connections between continuous-time dynamical
systems and optimization methods, notably in the context of accelerated methods for smooth …

Normalised latent measure factor models

M Beraha, JE Griffin - Journal of the Royal Statistical Society …, 2023 - academic.oup.com
We propose a methodology for modelling and comparing probability distributions within a
Bayesian nonparametric framework. Building on dependent normalised random measures …

Geometry of learning and representation in neural networks

P Sokół - 2023 - search.proquest.com
Theoretical neuroscience has come to face a unique set of opportunities and challenges. By
virtue of being at the nexus of experimental neurobiology and machine learning, theoretical …

Nesterov acceleration for Riemannian optimization

J Kim, I Yang - arXiv preprint arXiv:2202.02036, 2022 - arxiv.org
In this paper, we generalize the Nesterov accelerated gradient (NAG) method to solve
Riemannian optimization problems in a computationally tractable manner. The iteration …

Numerical KAM theory and backward error analysis for symplectic methods applied to (quasi-) periodically perturbed Hamiltonian ODE

F Carere - 2022 - studenttheses.uu.nl
Recently, a model for tidal waves in shallow areas, previously studied in the 1990s, has been
reconsidered. The goal was to study mixing and transport due to chaotic motion in …