Conditional gradient methods
G Braun, A Carderera, CW Combettes… - arXiv preprint arXiv …, 2022 - arxiv.org
The purpose of this survey is to serve both as a gentle introduction and a coherent overview
of state-of-the-art Frank--Wolfe algorithms, also called conditional gradient algorithms, for …
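As a point of reference for the entries below, here is a minimal sketch of the classical Frank–Wolfe loop that the survey covers. The least-squares objective, the probability simplex, and the 2/(t+2) step size are standard illustrative choices, not details taken from the survey itself:

```python
import numpy as np

def frank_wolfe(grad, lmo, x0, iters=1000):
    """Classical Frank-Wolfe: repeatedly minimize the linearized objective
    over the feasible set (one call to the LMO) and take a convex step."""
    x = x0
    for t in range(iters):
        g = grad(x)
        s = lmo(g)                         # s = argmin_{s in X} <g, s>
        gamma = 2.0 / (t + 2.0)            # classical step size, gives O(1/T)
        x = (1.0 - gamma) * x + gamma * s  # convex combination stays feasible
    return x

# Illustrative instance: f(x) = 0.5 * ||Ax - b||^2 over the probability simplex.
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 50))
b = rng.standard_normal(20)

def simplex_lmo(g):
    # Over the simplex the LMO returns the vertex e_i with the smallest g_i.
    s = np.zeros_like(g)
    s[np.argmin(g)] = 1.0
    return s

x_star = frank_wolfe(lambda x: A.T @ (A @ x - b), simplex_lmo, np.ones(50) / 50)
```

Note that the method never projects onto the feasible set: feasibility is maintained purely by taking convex combinations of vertices, which is what makes it "projection-free".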
[BOOK] Minimum-volume ellipsoids: Theory and algorithms
MJ Todd - 2016 - SIAM
Optimization is concerned with choosing the values of several variables so as to optimize (maximize or
minimize) an objective function, usually subject to several constraints. In the last twenty-five …
Linearly convergent away-step conditional gradient for non-strongly convex functions
We consider the problem of minimizing the sum of a linear function and a composition of a
strongly convex function with a linear transformation over a compact polyhedral set. Jaggi …
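To make the away-step mechanism concrete, here is a minimal sketch on the probability simplex with a least-squares objective and exact line search. These instance choices are assumptions for illustration; the paper's setting (a composite objective over a general compact polyhedral set) is broader:

```python
import numpy as np

def away_step_fw(A, b, n, iters=500):
    """Away-step Frank-Wolfe for f(x) = 0.5*||Ax - b||^2 over the simplex.
    On the simplex, coordinate x_i is the weight of vertex e_i, so the
    active set is just the support of x."""
    x = np.ones(n) / n
    for _ in range(iters):
        g = A.T @ (A @ x - b)
        s = np.argmin(g)                          # Frank-Wolfe vertex
        support = np.flatnonzero(x > 0)
        v = support[np.argmax(g[support])]        # away vertex
        d_fw = -x.copy(); d_fw[s] += 1.0          # direction e_s - x
        d_aw = x.copy();  d_aw[v] -= 1.0          # direction x - e_v
        if -g @ d_fw >= -g @ d_aw:                # compare the two gaps
            d, gamma_max = d_fw, 1.0
        else:                                     # away step: limited by the
            d, gamma_max = d_aw, x[v] / (1.0 - x[v])  # weight of vertex v
        Ad = A @ d
        denom = Ad @ Ad
        if denom == 0.0:
            break                                 # f is flat along d
        gamma = min(gamma_max, -(g @ d) / denom)  # exact line search, clipped
        x = x + gamma * d
    return x

# Illustrative usage on a random least-squares instance:
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 50))
b = rng.standard_normal(20)
x_star = away_step_fw(A, b, n=50)
```

On the simplex each coordinate is itself a vertex weight, so no separate active set is stored; over a general polytope one must track a vertex decomposition of the iterate explicitly.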
Projection-free optimization on uniformly convex sets
The Frank-Wolfe method solves smooth constrained convex optimization problems
at a generic sublinear rate of $\mathcal{O}(1/T)$, and it (or its variants) enjoys accelerated …
New characterizations of Hoffman constants for systems of linear constraints
We give a characterization of the Hoffman constant of a system of linear constraints in $\mathbb{R}^n$
relative to a reference polyhedron $R \subseteq \mathbb{R}^n$. The reference polyhedron R …
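For context, the classical (non-relative) Hoffman bound reads as follows; this is the textbook formulation rather than anything quoted from the paper, and the relative variant described in the snippet restricts attention to a reference polyhedron R:

```latex
% Classical Hoffman bound (Hoffman, 1952): for A in R^{m x n} and any b
% such that P(b) = {x : Ax <= b} is nonempty, there is a constant H(A),
% depending on A and the chosen norms but not on b, with
\[
  \operatorname{dist}\bigl(x, P(b)\bigr)
    \;\le\; H(A)\,\bigl\| (Ax - b)_{+} \bigr\|
  \qquad \text{for all } x \in \mathbb{R}^{n},
\]
% where (\cdot)_{+} denotes the componentwise positive part.
```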
New Analysis of an Away-Step Frank–Wolfe Method for Minimizing Log-Homogeneous Barriers
R Zhao - Mathematics of Operations Research, 2025 - pubsonline.informs.org
We present and analyze an away-step Frank–Wolfe method for the convex optimization
problem $\min_{x \in X} f(Ax) + \langle c, x \rangle$, where $f$ is a $\theta$-logarithmically homogeneous self …
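For readers unfamiliar with the barrier class named in this entry, the standard definition of θ-logarithmic homogeneity (a textbook identity, not quoted from the paper) is:

```latex
% A barrier f is \theta-logarithmically homogeneous if
\[
  f(tx) \;=\; f(x) \;-\; \theta \ln t
  \qquad \text{for all } x \in \operatorname{dom} f,\ t > 0;
\]
% e.g. f(x) = -\sum_{i=1}^{n} \ln x_i on the positive orthant is
% n-logarithmically homogeneous.
```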
Restarting Frank–Wolfe
Conditional Gradients (aka Frank-Wolfe algorithms) form a classical set of methods
for constrained smooth convex minimization due to their simplicity, the absence of projection …
Self-concordant analysis of Frank-Wolfe algorithms
Projection-free optimization via different variants of the Frank-Wolfe (FW) method, aka the conditional
gradient method, has become one of the cornerstones of optimization for machine learning …
Frank–Wolfe and friends: a journey into projection-free first-order optimization methods
Invented some 65 years ago in a seminal paper by Marguerite Straus-Frank and Philip
Wolfe, the Frank–Wolfe method has recently enjoyed a remarkable revival, fuelled by the need for …
Revisiting Frank–Wolfe for polytopes: Strict complementarity and sparsity
D Garber - Advances in Neural Information Processing …, 2020 - proceedings.neurips.cc
In recent years it was proved that simple modifications of the classical Frank-Wolfe algorithm
(aka conditional gradient algorithm) for smooth convex minimization over convex and …