A framework of constraint preserving update schemes for optimization on Stiefel manifold
This paper considers optimization problems on the Stiefel manifold X^T X = I_p,
where X ∈ R^{n×p} is the variable and I_p is the p-by-p identity matrix. A …
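As a concrete illustration of what "constraint preserving" means here, the sketch below takes one gradient step and retracts it back onto the manifold with a thin QR factorization. This is a generic feasible-update scheme in Python/NumPy, not the specific family proposed in the paper; the tangent-space projection and the stepsize tau are assumptions made for the sketch.

```python
import numpy as np

def stiefel_gradient_step(X, G, tau):
    """One feasible update on the Stiefel manifold {X : X^T X = I_p}.

    Generic sketch (not the paper's scheme): project the Euclidean
    gradient G onto the tangent space, take a step, and retract back
    onto the manifold via a thin QR factorization.
    """
    XtG = X.T @ G
    grad = G - X @ (XtG + XtG.T) / 2.0   # tangent-space projection of G
    Q, R = np.linalg.qr(X - tau * grad)  # thin QR retraction
    return Q * np.sign(np.diag(R))       # sign fix keeps columns orthonormal

# Feasibility check on random data: the constraint holds to machine precision.
rng = np.random.default_rng(0)
X, _ = np.linalg.qr(rng.standard_normal((8, 3)))
X_new = stiefel_gradient_step(X, rng.standard_normal((8, 3)), tau=0.1)
print(np.linalg.norm(X_new.T @ X_new - np.eye(3)))  # ~1e-15
```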
A family of spectral gradient methods for optimization
We propose a family of spectral gradient methods, whose stepsize is determined by a
convex combination of the long Barzilai–Borwein (BB) stepsize and the short BB stepsize …
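The two BB stepsizes referred to above have closed forms: with s_k = x_k − x_{k−1} and y_k = g_k − g_{k−1}, the long stepsize is s^T s / s^T y and the short one is s^T y / y^T y. A minimal NumPy sketch of the convex combination follows; the fixed weight gamma is a placeholder assumption, since the abstract does not say how the paper selects it.

```python
import numpy as np

def bb_stepsizes(s, y):
    """Long (BB1) and short (BB2) Barzilai-Borwein stepsizes from
    s = x_k - x_{k-1} and y = g_k - g_{k-1}."""
    sy = s @ y
    return (s @ s) / sy, sy / (y @ y)

def convex_bb_stepsize(s, y, gamma=0.5):
    """Convex combination of the two BB stepsizes. The paper chooses the
    combination parameter adaptively; the fixed gamma here is only a
    placeholder."""
    a_long, a_short = bb_stepsizes(s, y)
    return gamma * a_long + (1.0 - gamma) * a_short
```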
On the acceleration of the Barzilai–Borwein method
The Barzilai–Borwein (BB) gradient method is efficient for solving large-scale
unconstrained problems to modest accuracy due to its ingenious stepsize which generally …
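For reference, a plain BB gradient iteration on a strongly convex quadratic looks as follows; this is the baseline method such papers accelerate, not the accelerated variant itself. The initial stepsize and stopping rule are arbitrary choices for the sketch.

```python
import numpy as np

def bb_gradient_method(A, b, x0, max_iter=100, tol=1e-8):
    """Minimize f(x) = 0.5 x^T A x - b^T x with the long (BB1) stepsize."""
    x = x0.copy()
    g = A @ x - b
    alpha = 1.0 / np.linalg.norm(g)     # crude initial stepsize
    for _ in range(max_iter):
        x_new = x - alpha * g
        g_new = A @ x_new - b
        s, y = x_new - x, g_new - g
        alpha = (s @ s) / (s @ y)       # BB1 stepsize for the next iteration
        x, g = x_new, g_new
        if np.linalg.norm(g) < tol:
            break
    return x

rng = np.random.default_rng(1)
M = rng.standard_normal((20, 20))
A = M @ M.T + np.eye(20)                # symmetric positive definite
b = rng.standard_normal(20)
x = bb_gradient_method(A, b, np.zeros(20))
print(np.linalg.norm(A @ x - b))        # small residual
```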
Gradient methods exploiting spectral properties
We propose a new stepsize for the gradient method. It is shown that this new stepsize will
converge to the reciprocal of the largest eigenvalue of the Hessian, when Dai–Yang's …
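To see why a stepsize tending to 1/λ_max is plausible, note that for a quadratic f(x) = ½ x^T A x the gradient differences satisfy y = A s, so the ratio s^T y / y^T y lies between 1/λ_max and 1/λ_min and tends to 1/λ_max as s aligns with the dominant eigenvector. The toy demo below checks this numerically; it illustrates the spectral property only, not the paper's new stepsize.

```python
import numpy as np

A = np.diag(np.linspace(1.0, 10.0, 50))  # Hessian with lambda_max = 10
rng = np.random.default_rng(2)
s = rng.standard_normal(50)
for _ in range(30):
    y = A @ s                  # for a quadratic, y_k = A s_k
    ratio = (s @ y) / (y @ y)  # lies in [1/lambda_max, 1/lambda_min]
    s = y / np.linalg.norm(y)  # power iteration aligns s with the top eigenvector
print(ratio)                   # ~0.1 = 1 / lambda_max
```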
Smoothing projected Barzilai–Borwein method for constrained non-Lipschitz optimization
Y Huang, H Liu - Computational Optimization and Applications, 2016 - Springer
We present a smoothing projected Barzilai–Borwein (SPBB) algorithm for solving a class of
minimization problems on a closed convex set, where the objective function is nonsmooth …
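The two ingredients named in the title can be sketched directly: replace the nonsmooth term with a smooth surrogate, then run a projected gradient iteration with a BB stepsize. The instance below (an ℓ1 term smoothed to sqrt(t² + μ²), projected onto a box) is a hypothetical illustration; the paper's SPBB algorithm presumably updates the smoothing parameter and adds safeguards that are omitted here.

```python
import numpy as np

def spbb_sketch(c, lo, hi, x0, mu=1e-2, iters=200):
    """Sketch of the smoothing + projected-BB idea (not the SPBB algorithm
    as specified in the paper): minimize f(x) = c^T x + sum_i |x_i| over
    the box [lo, hi], with |t| smoothed to sqrt(t^2 + mu^2)."""
    proj = lambda z: np.clip(z, lo, hi)               # projection onto the box
    grad = lambda z: c + z / np.sqrt(z**2 + mu**2)    # gradient of smoothed f
    x, g, alpha = x0.copy(), grad(x0), 1.0
    for _ in range(iters):
        x_new = proj(x - alpha * g)
        g_new = grad(x_new)
        s, y = x_new - x, g_new - g
        sy = s @ y
        alpha = (s @ s) / sy if sy > 1e-12 else 1.0   # safeguarded BB1 stepsize
        x, g = x_new, g_new
    return x
```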
On the asymptotic convergence and acceleration of gradient methods
We consider the asymptotic behavior of a family of gradient methods, which includes the
steepest descent and minimal gradient methods as special instances. It is proved that each …
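For the quadratic case, one standard way to write such a family (stated here as an assumption about the setting, since the abstract is truncated) is alpha_k = g^T A^rho g / g^T A^(rho+1) g: rho = 0 recovers the steepest descent (exact line search) stepsize and rho = 1 the minimal gradient stepsize.

```python
import numpy as np

def family_stepsize(A, g, rho=0):
    """Stepsize alpha = g^T A^rho g / g^T A^(rho+1) g for the quadratic
    f(x) = 0.5 x^T A x - b^T x. rho = 0: steepest descent (exact line
    search); rho = 1: minimal gradient."""
    Ag = np.linalg.matrix_power(A, rho) @ g
    return (g @ Ag) / (g @ (A @ Ag))
```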
A Note on R-Linear Convergence of Nonmonotone Gradient Methods
XR Li, YK Huang - Journal of the Operations Research Society of China, 2023 - Springer
Nonmonotone gradient methods generally perform better than their monotone counterparts,
especially on unconstrained quadratic optimization. However, the known convergence rate …
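A typical nonmonotone device behind such methods is the Grippo–Lampariello–Lucidi test, which compares the trial value against the maximum of the last M objective values rather than the current one. The sketch below shows just that acceptance test as a generic device; whether the paper's rate analysis covers exactly this rule is not claimed here.

```python
def gll_accept(f_hist, f_trial, g, d, alpha, M=10, delta=1e-4):
    """Grippo-Lampariello-Lucidi nonmonotone acceptance test: accept the
    trial point if its value sits sufficiently below the maximum of the
    last M recorded objective values.
    f_hist: list of past f values; g: current gradient; d: search
    direction; alpha: trial stepsize."""
    f_ref = max(f_hist[-M:])                       # nonmonotone reference value
    return f_trial <= f_ref + delta * alpha * (g @ d)
```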