Convex optimization algorithms in medical image reconstruction—in the age of AI
The past decade has seen the rapid growth of model-based image reconstruction (MBIR)
algorithms, which are often applications or adaptations of convex optimization algorithms …
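As a concrete anchor for this snippet (a generic formulation of my own, not quoted from the survey), MBIR typically casts reconstruction as a convex variational problem:

```latex
\min_{x}\; \tfrac{1}{2}\,\lVert A x - y \rVert_2^2 \;+\; \lambda\, R(x)
```

where A is the forward (system) model, y the measured data, R a convex regularizer such as total variation, and λ > 0 a trade-off weight; the proximal and accelerated methods in the entries below are standard solvers for problems of this form.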
[BOOK] Modern nonconvex nondifferentiable optimization
Mathematical optimization has always been at the heart of engineering, statistics, and
economics. In these applied domains, optimization concepts and methods have often been …
Convex-concave backtracking for inertial Bregman proximal gradient algorithms in nonconvex optimization
Backtracking line-search is an old yet powerful strategy for finding better step sizes to be
used in proximal gradient algorithms. The main principle is to locally find a simple convex …
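A minimal sketch of the principle described above: proximal gradient descent where the step size is backtracked until a simple quadratic upper bound holds at the trial point. The Euclidean bound and the ℓ1 prox are illustrative stand-ins; the paper's actual scheme is Bregman and inertial.

```python
import numpy as np

def soft_threshold(v, t):
    """Prox of t * ||.||_1: elementwise soft-thresholding."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def prox_grad_backtracking(f, grad_f, x0, lam, step=1.0, shrink=0.5, iters=100):
    """Proximal gradient for f(x) + lam*||x||_1, x a flat vector.

    The step is shrunk until the local quadratic model
    f(x) + <g, z - x> + ||z - x||^2 / (2 step) majorizes f(z).
    """
    x = x0.copy()
    for _ in range(iters):
        g = grad_f(x)
        while True:
            z = soft_threshold(x - step * g, step * lam)
            d = z - x
            if f(z) <= f(x) + g @ d + (d @ d) / (2.0 * step):
                break
            step *= shrink  # bound violated: try a smaller step
        x = z
    return x
```

For a least-squares data term f(x) = ½‖Ax − y‖², grad_f is simply `lambda x: A.T @ (A @ x - y)`.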
Convergence of the momentum method for semialgebraic functions with locally Lipschitz gradients
We propose a new length formula that governs the iterates of the momentum method when
minimizing differentiable semialgebraic functions with locally Lipschitz gradients. It enables …
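For reference, the momentum (heavy-ball) iteration that the analysis concerns is short enough to state in full; the step and momentum parameters below are illustrative, and none of the paper's length-formula machinery is reproduced.

```python
import numpy as np

def heavy_ball(grad_f, x0, step=0.01, beta=0.9, iters=500):
    """Momentum method: x+ = x - step*grad_f(x) + beta*(x - x_prev)."""
    x_prev = x0.copy()
    x = x0.copy()
    for _ in range(iters):
        x, x_prev = x - step * grad_f(x) + beta * (x - x_prev), x
    return x
```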
Inertial proximal gradient methods with Bregman regularization for a class of nonconvex optimization problems
This paper proposes an inertial Bregman proximal gradient method for minimizing the sum
of two possibly nonconvex functions. This method includes two different inertial steps and …
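A sketch of what "two different inertial steps" can look like in a proximal gradient iteration: one extrapolation point feeds the gradient and another centers the proximal step. The Euclidean kernel below stands in for a general Bregman distance, and the coefficients and the ℓ1 prox are my illustrative choices, not the paper's scheme.

```python
import numpy as np

def two_point_inertial_pg(grad_f, x0, lam, step=0.1, alpha=0.3, beta=0.3, iters=200):
    """Inertial proximal gradient with two extrapolation points.

    With alpha == beta this collapses to the usual single-inertia scheme.
    """
    x_prev = x0.copy()
    x = x0.copy()
    for _ in range(iters):
        y = x + alpha * (x - x_prev)   # inertial point for the gradient
        z = x + beta * (x - x_prev)    # inertial point for the prox center
        v = z - step * grad_f(y)
        x, x_prev = np.sign(v) * np.maximum(np.abs(v) - step * lam, 0.0), x
    return x
```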
Adaptive restart of accelerated gradient methods under local quadratic growth condition
By analyzing accelerated proximal gradient methods under a local quadratic growth
condition, we show that restarting these algorithms at any frequency gives a globally linearly …
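The restart idea is easy to sketch for a smooth objective: run the accelerated iteration and reset the momentum whenever the function value increases (a standard function-value restart test; the exact test and parameters here are illustrative, not the paper's).

```python
import numpy as np

def accel_grad_restart(f, grad_f, x0, step, iters=300):
    """Accelerated gradient with adaptive (function-value) restart."""
    x_prev = x0.copy()
    x = x0.copy()
    t = 1.0
    for _ in range(iters):
        t_next = 0.5 * (1.0 + np.sqrt(1.0 + 4.0 * t * t))
        y = x + ((t - 1.0) / t_next) * (x - x_prev)
        x_new = y - step * grad_f(y)
        if f(x_new) > f(x):              # objective went up: drop the momentum
            t_next = 1.0
            x_new = x - step * grad_f(x)
        x, x_prev, t = x_new, x, t_next
    return x
```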
General inertial proximal gradient method for a class of nonconvex nonsmooth optimization problems
Z Wu, M Li - Computational Optimization and Applications, 2019 - Springer
In this paper, we consider a general inertial proximal gradient method with constant and
variable stepsizes for a class of nonconvex nonsmooth optimization problems. The …
Joint sparse optimization: lower-order regularization method and application in cell fate conversion
Multiple measurement signals are commonly collected in practical applications, and joint
sparse optimization exploits the shared (synchronous) structure across these signals to …
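The paper studies nonconvex lower-order regularizers; as a simpler illustration of the joint-sparsity mechanism itself, here is the prox of the convex ℓ2,1 norm, which shrinks or zeroes whole rows so that all measurement channels share one support (my example, not the paper's method).

```python
import numpy as np

def prox_l21(X, t):
    """Row-wise prox of t * sum_i ||X[i, :]||_2 (group soft-thresholding)."""
    norms = np.linalg.norm(X, axis=1, keepdims=True)
    scale = np.maximum(1.0 - t / np.maximum(norms, 1e-12), 0.0)
    return scale * X  # rows with norm below t are zeroed out
```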
A family of inexact SQA methods for non-smooth convex minimization with provable convergence guarantees based on the Luo–Tseng error bound property
We propose a new family of inexact sequential quadratic approximation (SQA) methods,
which we call the inexact regularized proximal Newton (IRPN) method, for minimizing the …
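A sketch of one sequential quadratic approximation step for f(x) + λ‖x‖₁: build a regularized quadratic model of f and minimize the model plus regularizer inexactly with a few inner proximal gradient iterations. The regularization rule and the fixed inner-iteration budget are illustrative, not the IRPN inexactness criteria from the paper.

```python
import numpy as np

def sqa_step(grad_f, hess_f, x, lam, mu=1e-3, inner_iters=20):
    """One inexact regularized proximal Newton step for f(x) + lam*||x||_1."""
    g = grad_f(x)
    H = hess_f(x) + mu * np.eye(x.size)   # regularized Hessian model
    L = np.linalg.norm(H, 2)              # Lipschitz constant of the model gradient
    d = np.zeros_like(x)
    for _ in range(inner_iters):          # inexact inner solve by prox-gradient
        v = (x + d) - (g + H @ d) / L
        d = np.sign(v) * np.maximum(np.abs(v) - lam / L, 0.0) - x
    return x + d
```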
A block symmetric Gauss–Seidel decomposition theorem for convex composite quadratic programming and its applications
For a symmetric positive semidefinite linear system of equations Qx = b, where
x = (x_1, …, x_s) is partitioned into s blocks, with s ≥ 2, we show that each …
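For intuition, one symmetric Gauss-Seidel sweep for Qx = b is a forward pass followed by a backward pass over the blocks, each solving a single block system with the other blocks frozen. This generic sweep only illustrates the object the theorem analyzes, and it assumes the diagonal blocks are nonsingular (the paper covers the semidefinite case).

```python
import numpy as np

def sgs_sweep(Q, b, x, blocks):
    """One symmetric Gauss-Seidel sweep for Q x = b.

    'blocks' is a list of index arrays partitioning the s >= 2 variable blocks.
    """
    for order in (blocks, list(reversed(blocks))):   # forward, then backward
        for idx in order:
            Qii = Q[np.ix_(idx, idx)]
            r = b[idx] - Q[idx] @ x + Qii @ x[idx]   # residual excluding block idx
            x[idx] = np.linalg.solve(Qii, r)
    return x
```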