Accelerated gradient methods for geodesically convex optimization: Tractable algorithms and convergence analysis
We propose computationally tractable accelerated first-order methods for Riemannian
optimization, extending the Nesterov accelerated gradient (NAG) method. For both …
Dynamical systems–based neural networks
Neural networks have gained much interest because of their effectiveness in many
applications. However, their mathematical properties are generally not well understood. If …
Geometric methods for sampling, optimization, inference, and adaptive agents
In this chapter, we identify fundamental geometric structures that underlie the problems of
sampling, optimization, inference, and adaptive decision-making. Based on this …
Negative curvature obstructs acceleration for strongly geodesically convex optimization, even with exact first-order oracles
Hamilton and Moitra (2021) showed that, in certain regimes, it is not possible to accelerate
Riemannian gradient descent in the hyperbolic plane if we restrict ourselves to algorithms …
A nonsmooth dynamical systems perspective on accelerated extensions of ADMM
Recently, there has been great interest in connections between continuous-time dynamical
systems and optimization methods, notably in the context of accelerated methods for smooth …
Normalised latent measure factor models
We propose a methodology for modelling and comparing probability distributions within a
Bayesian nonparametric framework. Building on dependent normalised random measures …
Geometry of learning and representation in neural networks
P Sokół - 2023 - search.proquest.com
Theoretical neuroscience has come to face a unique set of opportunities and challenges. By
virtue of being at the nexus of experimental neurobiology and machine learning, theoretical …
Nesterov acceleration for Riemannian optimization
In this paper, we generalize the Nesterov accelerated gradient (NAG) method to solve
Riemannian optimization problems in a computationally tractable manner. The iteration …
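For reference, the Euclidean NAG iteration that this entry and the first entry above set out to generalize can be sketched as follows (a minimal sketch for a smooth convex objective f with step size s, supplied here as background rather than quoted from either abstract):

x_{k+1} = y_k - s\,\nabla f(y_k), \qquad y_{k+1} = x_{k+1} + \tfrac{k}{k+3}\,(x_{k+1} - x_k), \qquad y_0 = x_0.

How these updates are carried over to a Riemannian manifold is the subject of the papers themselves and is cut off by the truncated snippets.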
Numerical KAM theory and backward error analysis for symplectic methods applied to (quasi-) periodically perturbed Hamiltonian ODE
F Carere - 2022 - studenttheses.uu.nl
Recently, a model for tidal waves in shallow areas, previously studied in the 1990s, has been
reconsidered. The goal was to study mixing and transport due to chaotic motion in …