A unified discretization framework for differential equation approach with Lyapunov arguments for convex optimization
The differential equation (DE) approach for convex optimization, which relates optimization
methods to specific continuous DEs with rate-revealing Lyapunov functionals, has gained …
Continuous vs. discrete optimization of deep neural networks
Existing analyses of optimization in deep learning are either continuous, focusing on
(variants of) gradient flow, or discrete, directly treating (variants of) gradient descent …
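The continuous/discrete dichotomy this snippet refers to can be made concrete on a toy problem: explicit Euler discretization of the gradient flow x' = -∇f(x) with step size h is exactly gradient descent. A minimal sketch (the quadratic f, matrix Q, and step size are illustrative assumptions, not taken from the cited work):

```python
import numpy as np

Q = np.diag([1.0, 10.0])  # SPD matrix defining f(x) = 0.5 * x^T Q x
h = 0.05                  # Euler step size / gradient-descent learning rate

def grad_f(x):
    # gradient of the quadratic objective
    return Q @ x

x = np.array([1.0, 1.0])
# Gradient descent = explicit Euler on the gradient flow x' = -grad_f(x)
for _ in range(500):
    x = x - h * grad_f(x)
# x approaches the minimizer x* = 0
```

Discrete analyses study this recursion directly; continuous analyses study the h → 0 limit and then ask which properties survive discretization.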
Continuous-time analysis of accelerated gradient methods via conservation laws in dilated coordinate systems
We analyze continuous-time models of accelerated gradient methods by deriving
conservation laws in dilated coordinate systems. Namely, instead of analyzing the dynamics …
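For context, the best-known continuous-time model of this kind is the limiting ODE of Nesterov's accelerated gradient method due to Su, Boyd, and Candès, which conservation-law analyses typically take as a starting point:

```latex
\ddot{X}(t) + \frac{3}{t}\,\dot{X}(t) + \nabla f\bigl(X(t)\bigr) = 0,
\qquad
f\bigl(X(t)\bigr) - f^{\star} = O\!\left(\frac{1}{t^{2}}\right).
```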
Infotainment enabled smart cars: A joint communication, caching, and computation approach
The remarkable prevalence of cloud computing has enabled smart cars to provide infotainment
services. However, retrieving infotainment contents from long-distance data centers poses a …
Finite-time convergence in continuous-time optimization
In this paper, we investigate a Lyapunov-like differential inequality that allows us to establish
finite-time stability of a continuous-time state-space dynamical system represented via a …
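A standard inequality of this type, stated here as an illustration (the paper's exact condition may differ): if a Lyapunov function satisfies, for constants c > 0 and 0 < α < 1,

```latex
\dot{V}(x(t)) \le -c\,V(x(t))^{\alpha}
\quad\Longrightarrow\quad
V(x(t)) = 0 \ \text{ for all } t \ge T,
\qquad
T \le \frac{V(x(0))^{\,1-\alpha}}{c\,(1-\alpha)},
```

since d/dt [V^{1-α}] = (1-α) V^{-α} V̇ ≤ -c(1-α), so V^{1-α} decreases at least linearly and must reach zero in finite time.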
Accelerated primal-dual methods for linearly constrained convex optimization problems
H. Luo - arXiv preprint arXiv:2109.12604, 2021 - arxiv.org
This work proposes an accelerated primal-dual dynamical system for affine constrained
convex optimization and presents a class of primal-dual methods with nonergodic …
On dissipative symplectic integration with applications to gradient-based optimization
Recently, continuous-time dynamical systems have proved useful in providing conceptual
and quantitative insights into gradient-based optimization, widely used in modern machine …
Finite-sample analysis of nonlinear stochastic approximation with applications in reinforcement learning
Motivated by applications in reinforcement learning (RL), we study a nonlinear stochastic
approximation (SA) algorithm under Markovian noise, and establish its finite-sample …
Learning-accelerated ADMM for distributed DC optimal power flow
We propose a novel data-driven method to accelerate the convergence of Alternating
Direction Method of Multipliers (ADMM) for solving distributed DC optimal power flow (DC …
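As background on the method being accelerated: ADMM alternates minimization over two split variable blocks with a dual (multiplier) update. A minimal sketch on a lasso problem, min 0.5‖Ax − b‖² + λ‖z‖₁ subject to x = z (problem data and parameters are illustrative assumptions, unrelated to DC optimal power flow):

```python
import numpy as np

def soft_threshold(v, k):
    # proximal operator of k * ||.||_1
    return np.sign(v) * np.maximum(np.abs(v) - k, 0.0)

rng = np.random.default_rng(0)
A = rng.standard_normal((30, 10))
x_true = np.zeros(10)
x_true[:3] = [1.0, -2.0, 0.5]
b = A @ x_true                      # noiseless measurements
lam, rho = 0.1, 1.0                 # l1 weight and ADMM penalty

x = np.zeros(10)
z = np.zeros(10)
u = np.zeros(10)                    # scaled dual variable
AtA, Atb = A.T @ A, A.T @ b
for _ in range(200):
    # x-update: ridge-regularized least squares
    x = np.linalg.solve(AtA + rho * np.eye(10), Atb + rho * (z - u))
    # z-update: soft thresholding
    z = soft_threshold(x + u, lam / rho)
    # dual update on the consensus constraint x = z
    u = u + x - z
# at convergence x ≈ z, a sparse estimate of x_true
```

Learning-accelerated variants such as the one in this snippet keep this iteration structure and use data-driven components to reduce the number of iterations needed.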
Conformal symplectic and relativistic optimization
Arguably, the two most popular accelerated or momentum-based optimization methods are
Nesterov's accelerated gradient and Polyak's heavy ball, both corresponding to different …
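The two methods named in this snippet differ only in where the gradient is evaluated. A minimal sketch on a strongly convex quadratic (the test problem, step size, and momentum parameter are illustrative assumptions):

```python
import numpy as np

Q = np.diag([1.0, 100.0])           # f(x) = 0.5 * x^T Q x, condition number 100
L, mu = 100.0, 1.0
h = 1.0 / L
beta = (np.sqrt(L) - np.sqrt(mu)) / (np.sqrt(L) + np.sqrt(mu))

def grad(x):
    return Q @ x

# Polyak's heavy ball: gradient at the current iterate, plus momentum
x_prev = x = np.array([1.0, 1.0])
for _ in range(300):
    x, x_prev = x - h * grad(x) + beta * (x - x_prev), x

# Nesterov's accelerated gradient: gradient at the extrapolated point
y_prev = y = np.array([1.0, 1.0])
for _ in range(300):
    v = y + beta * (y - y_prev)     # look-ahead step
    y, y_prev = v - h * grad(v), y
# both x and y approach the minimizer 0
```

This evaluation-point difference is exactly what makes the two methods correspond to different continuous-time dynamical systems.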