Conditional gradient methods
G Braun, A Carderera, CW Combettes… - arXiv preprint arXiv …, 2022 - arxiv.org
The purpose of this survey is to serve both as a gentle introduction and a coherent overview
of state-of-the-art Frank–Wolfe algorithms, also called conditional gradient algorithms, for …
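The conditional gradient (Frank-Wolfe) step that this survey covers can be sketched in a few lines: at each iteration, call a linear minimization oracle over the feasible set and move toward its output with a diminishing step size. Below is a minimal illustration over the probability simplex; the function name, the quadratic objective, and the step-size rule 2/(t+2) are a standard textbook setup, not code from the survey.

```python
import numpy as np

def frank_wolfe_simplex(grad, x0, steps=2000):
    """Classic Frank-Wolfe (conditional gradient) over the probability simplex."""
    x = x0.copy()
    for t in range(steps):
        g = grad(x)
        # LMO over the simplex: the vertex e_i with the smallest gradient coordinate
        i = int(np.argmin(g))
        v = np.zeros_like(x)
        v[i] = 1.0
        gamma = 2.0 / (t + 2.0)  # standard open-loop step size
        x = (1.0 - gamma) * x + gamma * v  # convex combination stays feasible
    return x

# Example: minimize f(x) = 0.5 * ||x - b||^2 over the simplex,
# i.e. project b onto the probability simplex.
b = np.array([0.2, 0.5, 0.1])
x = frank_wolfe_simplex(lambda x: x - b, np.ones(3) / 3)
```

Note that the iterate is always a convex combination of simplex vertices, so no projection step is ever needed — the property that makes these methods attractive for the structured feasible sets discussed in the entries below.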
Learning with combinatorial optimization layers: a probabilistic approach
Combinatorial optimization (CO) layers in machine learning (ML) pipelines are a powerful
tool to tackle data-driven decision tasks, but they come with two main challenges. First, the …
Pairwise conditional gradients without swap steps and sparser kernel herding
KK Tsuji, K Tanaka, S Pokutta - International Conference on …, 2022 - proceedings.mlr.press
Abstract The Pairwise Conditional Gradients (PCG) algorithm is a powerful extension of the
Frank-Wolfe algorithm leading to particularly sparse solutions, which makes PCG very …
Improved local models and new Bell inequalities via Frank-Wolfe algorithms
In Bell scenarios with two outcomes per party, we algorithmically consider the two sides of
the membership problem for the local polytope: Constructing local models and deriving …
Scalable Frank–Wolfe on generalized self-concordant functions via simple steps
Generalized self-concordance is a key property present in the objective function of many
important learning problems. We establish the convergence rate of a simple Frank–Wolfe …
Simple steps are all you need: Frank-Wolfe and generalized self-concordant functions
Generalized self-concordance is a key property present in the objective function of many
important learning problems. We establish the convergence rate of a simple Frank-Wolfe …
Interpretable neural networks with Frank-Wolfe: Sparse relevance maps and relevance orderings
We study the effects of constrained optimization formulations and Frank-Wolfe algorithms for
obtaining interpretable neural network predictions. Reformulating the Rate-Distortion …
Improved algorithms and novel applications of the FrankWolfe.jl library
Frank-Wolfe (FW) algorithms have emerged as an essential class of methods for constrained
optimization, especially on large-scale problems. In this paper, we summarize the …
Better bounds on Grothendieck constants of finite orders
Grothendieck constants $K_G(d)$ bound the advantage of $d$-dimensional strategies
over $1$-dimensional ones in a specific optimisation task. They have applications ranging …
Solving the Optimal Experiment Design Problem with Mixed-Integer Convex Methods
We tackle the Optimal Experiment Design Problem, which consists of choosing experiments
to run or observations to select from a finite set to estimate the parameters of a system. The …