Directional smoothness and gradient methods: Convergence and adaptivity

A Mishkin, A Khaled, Y Wang… - Advances in Neural …, 2025 - proceedings.neurips.cc
We develop new sub-optimality bounds for gradient descent (GD) that depend on the
conditioning of the objective along the path of optimization, rather than on global, worst-case …
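
To make the contrast in this snippet concrete, here is a minimal sketch (not taken from the paper): plain GD on a quadratic, comparing the global smoothness constant against a secant-style estimate of the smoothness observed along the actual optimization path. The quadratic, the step size, and the estimate M_k are illustrative assumptions, not the paper's directional-smoothness construction.

```python
import numpy as np

# f(x) = 0.5 * x^T A x, with one very stiff direction.
A = np.diag([1.0, 10.0, 1000.0])
grad = lambda x: A @ x
L_global = np.linalg.eigvalsh(A).max()  # global, worst-case smoothness constant

x = np.array([1.0, 1.0, 1e-3])          # start nearly orthogonal to the stiff direction
eta = 1.0 / L_global
path_constants = []
for _ in range(50):
    x_next = x - eta * grad(x)
    # Secant estimate of smoothness along the step actually taken
    # (illustrative only; a lower bound on any valid local constant).
    M_k = np.linalg.norm(grad(x_next) - grad(x)) / np.linalg.norm(x_next - x)
    path_constants.append(M_k)
    x = x_next

print(f"global L = {L_global:.1f}, max path-wise M_k = {max(path_constants):.1f}")
```

On this example the path-wise constants stay well below the global constant, which is the kind of gap that path-dependent sub-optimality bounds can exploit.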

Glocal Smoothness: Line Search can really help!

C Fox, M Schmidt - OPT 2024: Optimization for Machine Learning - openreview.net
Iteration complexities are bounds on the number of iterations of an algorithm. Iteration
complexities for first-order numerical optimization algorithms are typically stated in terms of a …
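
As background for the line-search theme of this entry, the sketch below shows gradient descent with backtracking (Armijo) line search. It is a standard textbook scheme rather than the specific method analyzed by the authors; the objective, the sufficient-decrease constant c, and the shrink factor are illustrative choices.

```python
import numpy as np

def gd_armijo(f, grad, x0, eta0=1.0, c=1e-4, shrink=0.5, iters=100):
    """Gradient descent with backtracking (Armijo) line search: the step size
    adapts to the local geometry instead of being fixed from a global
    smoothness constant."""
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        g = grad(x)
        eta = eta0
        # Shrink the step until the Armijo sufficient-decrease condition holds.
        while f(x - eta * g) > f(x) - c * eta * (g @ g):
            eta *= shrink
        x = x - eta * g
    return x

# Example on a poorly conditioned quadratic.
A = np.diag([1.0, 100.0])
f = lambda x: 0.5 * x @ A @ x
grad = lambda x: A @ x
print(gd_armijo(f, grad, x0=np.array([1.0, 1.0])))
```

The backtracking loop always terminates for smooth objectives, and the accepted step sizes track the curvature seen along the trajectory, which is the practical benefit the title alludes to.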