A survey of recent advances in optimization methods for wireless communications

YF Liu, TH Chang, M Hong, Z Wu… - IEEE Journal on …, 2024 - ieeexplore.ieee.org
Mathematical optimization is now widely regarded as an indispensable modeling and
solution tool for the design of wireless communications systems. While optimization has …

Linear convergence of gradient and proximal-gradient methods under the Polyak-Łojasiewicz condition

H Karimi, J Nutini, M Schmidt - … Conference, ECML PKDD 2016, Riva del …, 2016 - Springer
In 1963, Polyak proposed a simple condition that is sufficient to show a global linear
convergence rate for gradient descent. This condition is a special case of the Łojasiewicz …
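As a quick illustration of the kind of result described here (a minimal sketch with standard parameter choices, not code from the paper): gradient descent on a rank-deficient least-squares problem, which satisfies the Polyak-Łojasiewicz condition without being strongly convex, still exhibits a geometrically decaying objective gap.

```python
import numpy as np

# Sketch: gradient descent on a rank-deficient least-squares problem
# f(x) = 0.5*||Ax - b||^2, which satisfies the Polyak-Lojasiewicz (PL)
# condition even though its Hessian A^T A is singular (not strongly convex).
rng = np.random.default_rng(0)
A = rng.standard_normal((50, 20)) @ np.diag(np.r_[np.ones(10), np.zeros(10)])  # rank 10
b = rng.standard_normal(50)

L = np.linalg.norm(A, 2) ** 2                 # Lipschitz constant of the gradient
f = lambda x: 0.5 * np.linalg.norm(A @ x - b) ** 2
f_star = f(np.linalg.pinv(A) @ b)             # optimal value via a least-squares solution

x = np.zeros(20)
gaps = []
for _ in range(200):
    x -= (1.0 / L) * (A.T @ (A @ x - b))      # gradient step with step size 1/L
    gaps.append(f(x) - f_star)

# Under PL with constant mu, theory gives f(x_k) - f* <= (1 - mu/L)^k (f(x_0) - f*);
# the printed gaps shrink geometrically despite the singular Hessian.
print(gaps[::50])
```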

Acceleration methods

A d'Aspremont, D Scieur, A Taylor - Foundations and Trends® …, 2021 - nowpublishers.com
This monograph covers some recent advances in a range of acceleration techniques
frequently used in convex optimization. We first use quadratic optimization problems to …
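A minimal sketch of what acceleration buys on a quadratic, using the standard textbook Nesterov scheme rather than any particular method from the monograph; the problem data, step size, and momentum weight below are illustrative assumptions.

```python
import numpy as np

# Sketch (standard textbook schemes): plain gradient descent vs. Nesterov's
# accelerated method on a strongly convex quadratic f(x) = 0.5 * x^T Q x,
# where acceleration improves the rate from roughly (1 - mu/L)^k
# to roughly (1 - sqrt(mu/L))^k.
n = 100
Q = np.diag(np.linspace(1.0, 1000.0, n))      # eigenvalues give mu = 1, L = 1000
mu, L = 1.0, 1000.0
x0 = np.ones(n)

def grad_descent(iters):
    x = x0.copy()
    for _ in range(iters):
        x -= (1.0 / L) * (Q @ x)
    return 0.5 * x @ Q @ x

def nesterov(iters):
    x, y = x0.copy(), x0.copy()
    beta = (np.sqrt(L) - np.sqrt(mu)) / (np.sqrt(L) + np.sqrt(mu))  # momentum weight
    for _ in range(iters):
        x_new = y - (1.0 / L) * (Q @ y)       # gradient step at the extrapolated point
        y = x_new + beta * (x_new - x)        # extrapolation (momentum)
        x = x_new
    return 0.5 * x @ Q @ x

print(grad_descent(300), nesterov(300))       # accelerated objective is orders of magnitude smaller
```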

Error bounds, quadratic growth, and linear convergence of proximal methods

D Drusvyatskiy, AS Lewis - Mathematics of Operations …, 2018 - pubsonline.informs.org
The proximal gradient algorithm for minimizing the sum of a smooth and nonsmooth convex
function often converges linearly even without strong convexity. One common reason is that …
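A minimal proximal-gradient (ISTA-type) sketch for a smooth-plus-nonsmooth composite of the kind discussed here, with the ℓ1 norm as the nonsmooth term; the data and parameters are illustrative, not taken from the paper.

```python
import numpy as np

# Proximal-gradient sketch for min_x 0.5*||Ax - b||^2 + lam*||x||_1,
# i.e., a smooth term plus a nonsmooth convex term (illustration only).
def soft_threshold(v, t):
    # prox of t*||.||_1: shrink each entry toward zero by t
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def prox_gradient(A, b, lam, iters=500):
    L = np.linalg.norm(A, 2) ** 2                 # Lipschitz constant of the smooth part
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        grad = A.T @ (A @ x - b)                  # gradient of the smooth term
        x = soft_threshold(x - grad / L, lam / L) # prox step on the l1 term
    return x

rng = np.random.default_rng(0)
A = rng.standard_normal((40, 100))
b = A @ np.r_[np.ones(5), np.zeros(95)] + 0.01 * rng.standard_normal(40)
print(np.count_nonzero(prox_gradient(A, b, lam=0.1)))   # recovers a sparse solution
```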

Linear convergence of first order methods for non-strongly convex optimization

I Necoara, Y Nesterov, F Glineur - Mathematical Programming, 2019 - Springer
The standard assumption for proving linear convergence of first order methods for smooth
convex optimization is the strong convexity of the objective function, an assumption which …
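One representative relaxation of strong convexity studied in this line of work is quadratic functional growth around the solution set; a generic statement (in our notation, not necessarily the paper's) is:

```latex
% Quadratic growth around the solution set X^* = \arg\min f, a condition
% weaker than strong convexity that still yields linear rates for
% first-order methods (generic statement, notation is illustrative).
f(x) - f^{\star} \;\ge\; \frac{\kappa}{2}\,\operatorname{dist}(x, X^{\star})^{2}
\qquad \text{for all } x \text{ in a level set of } f,\;\; \kappa > 0.
```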

Calculus of the exponent of Kurdyka–Łojasiewicz inequality and its applications to linear convergence of first-order methods

G Li, TK Pong - Foundations of Computational Mathematics, 2018 - Springer
In this paper, we study the Kurdyka–Łojasiewicz (KL) exponent, an important quantity for
analyzing the convergence rate of first-order methods. Specifically, we develop various …
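For orientation, a generic statement of the KL inequality with exponent α at a stationary point x̄ (the notation below may differ slightly from the paper's) is:

```latex
% KL inequality with exponent alpha at \bar{x}: for some c, \epsilon > 0 and
% all x near \bar{x} with f(\bar{x}) < f(x) < f(\bar{x}) + \epsilon,
\operatorname{dist}\bigl(0, \partial f(x)\bigr)
\;\ge\; c\,\bigl(f(x) - f(\bar{x})\bigr)^{\alpha},
\qquad \alpha \in [0, 1).
% The exponent alpha = 1/2 is the case typically associated with linear
% convergence of first-order methods.
```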

A highly efficient semismooth Newton augmented Lagrangian method for solving Lasso problems

X Li, D Sun, KC Toh - SIAM Journal on Optimization, 2018 - SIAM
We develop a fast and robust algorithm for solving large-scale convex composite
optimization models with an emphasis on the ℓ1-regularized least squares regression …
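For reference, the ℓ1-regularized least-squares (Lasso) model referred to in the abstract is:

```latex
% The Lasso problem: least-squares data fit plus an l1 penalty with weight lambda.
\min_{x \in \mathbb{R}^{n}} \;\; \tfrac{1}{2}\,\|Ax - b\|_{2}^{2} \;+\; \lambda\,\|x\|_{1},
\qquad \lambda > 0.
```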

Linear convergence of proximal gradient algorithm with extrapolation for a class of nonconvex nonsmooth minimization problems

B Wen, X Chen, TK Pong - SIAM Journal on Optimization, 2017 - SIAM
In this paper, we study the proximal gradient algorithm with extrapolation for minimizing the
sum of a Lipschitz differentiable function and a proper closed convex function. Under the …
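The extrapolated proximal-gradient iteration analyzed in this setting, for F = f + g with f Lipschitz differentiable and g proper closed convex, can be written generically as follows (β_k is the extrapolation weight; the 1/L step size is a common choice, not the only one):

```latex
% Proximal gradient with extrapolation (generic form): extrapolate, then take a
% prox-gradient step at the extrapolated point y^k.
y^{k} = x^{k} + \beta_{k}\,(x^{k} - x^{k-1}),
\qquad
x^{k+1} = \operatorname{prox}_{\frac{1}{L} g}\!\Bigl(y^{k} - \tfrac{1}{L}\,\nabla f(y^{k})\Bigr).
```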

RSG: Beating subgradient method without smoothness and strong convexity

T Yang, Q Lin - Journal of Machine Learning Research, 2018 - jmlr.org
In this paper, we study the efficiency of a Restarted SubGradient (RSG) method that
periodically restarts the standard subgradient method (SG). We show that, when applied to a …
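A schematic restart scheme in the spirit of RSG (stage lengths and step sizes below are illustrative assumptions, not the paper's schedule): run the subgradient method in stages, restart each stage from the previous stage's average iterate, and shrink the step size geometrically.

```python
import numpy as np

# Schematic restarted-subgradient sketch: minimize the nonsmooth convex function
# f(x) = ||Ax - b||_1 in stages, restarting from the stage average and halving
# the step size after each stage (parameters are illustrative only).
rng = np.random.default_rng(0)
A = rng.standard_normal((30, 10))
b = A @ rng.standard_normal(10)
f = lambda x: np.abs(A @ x - b).sum()

def subgradient_stage(x0, step, iters):
    x, avg = x0.copy(), np.zeros_like(x0)
    for _ in range(iters):
        g = A.T @ np.sign(A @ x - b)     # a subgradient of ||Ax - b||_1
        x = x - step * g
        avg += x / iters
    return avg                           # restart point: the stage average

x, step = np.zeros(10), 0.01
for stage in range(10):
    x = subgradient_stage(x, step, iters=200)
    step *= 0.5                          # geometrically decreasing step size per stage
    print(stage, f(x))                   # objective decreases stage by stage
```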

A unified analysis of descent sequences in weakly convex optimization, including convergence rates for bundle methods

F Atenas, C Sagastizábal, PJS Silva, M Solodov - SIAM Journal on …, 2023 - SIAM
We present a framework for analyzing convergence and local rates of convergence of a
class of descent algorithms, assuming the objective function is weakly convex. The …
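For context, weak convexity, the standing assumption of this framework, can be stated generically as follows:

```latex
% f is rho-weakly convex if adding a quadratic of modulus rho makes it convex.
x \;\mapsto\; f(x) + \frac{\rho}{2}\,\|x\|^{2} \quad \text{is convex for some } \rho \ge 0.
```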