On the complexity of approximating multimarginal optimal transport
We study the complexity of approximating the multimarginal optimal transport (MOT)
distance, a generalization of the classical optimal transport distance, considered here …
The approximate duality gap technique: A unified theory of first-order methods
We present a general technique for the analysis of first-order methods. The technique relies
on the construction of a duality gap for an appropriate approximation of the objective …
Cyclic block coordinate descent with variance reduction for composite nonconvex optimization
Nonconvex optimization is central in solving many machine learning problems, in which
block-wise structure is commonly encountered. In this work, we propose cyclic block …
On a combination of alternating minimization and Nesterov's momentum
Alternating minimization (AM) procedures are practically efficient in many applications for
solving convex and non-convex optimization problems. On the other hand, Nesterov's …
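The entry above concerns combining alternating minimization with Nesterov's momentum. As background, here is a minimal sketch of plain Nesterov acceleration on a smooth convex quadratic; it is not the paper's combined algorithm, and the problem data and iteration count are illustrative choices only:

```python
import numpy as np

# Minimal sketch of Nesterov's momentum (accelerated gradient descent)
# on a smooth convex quadratic f(x) = 0.5 x^T A x - b^T x. This is plain
# acceleration, not the paper's combination with alternating minimization;
# the problem data and iteration count are illustrative.

rng = np.random.default_rng(1)
M = rng.standard_normal((5, 5))
A = M @ M.T + np.eye(5)               # symmetric positive definite
b = rng.standard_normal(5)
grad = lambda x: A @ x - b
L = np.linalg.eigvalsh(A).max()       # Lipschitz constant of the gradient

x = np.zeros(5)
y = x.copy()
t = 1.0
for _ in range(3000):
    x_next = y - grad(y) / L                          # gradient step at the extrapolated point
    t_next = (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0
    y = x_next + ((t - 1.0) / t_next) * (x_next - x)  # momentum extrapolation
    x, t = x_next, t_next

x_star = np.linalg.solve(A, b)        # direct solution for comparison
print(np.linalg.norm(x - x_star))     # small: iterates approach the minimizer
```

The extrapolation sequence t_k is the standard choice that yields the O(1/k^2) rate for smooth convex objectives.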
Accelerated alternating minimization
Alternating minimization (AM) optimization algorithms have been known for a long time and
are of importance in machine learning problems, among which we are mostly motivated by …
Block coordinate descent on smooth manifolds: Convergence theory and twenty-one examples
Block coordinate descent is an optimization paradigm that iteratively updates one block of
variables at a time, making it quite amenable to big data applications due to its scalability …
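The entry above describes block coordinate descent: iteratively updating one block of variables at a time. A minimal Euclidean sketch follows (the cited paper treats the manifold setting; the quadratic objective, two-block split, and sweep count here are illustrative assumptions, not taken from the paper):

```python
import numpy as np

# Minimal sketch of block coordinate descent on a smooth convex quadratic
# f(x) = 0.5 x^T A x - b^T x, cycling over two blocks and exactly
# minimizing over each block in turn. Problem data, block split, and
# sweep count are illustrative choices, not taken from the cited paper.

rng = np.random.default_rng(0)
M = rng.standard_normal((6, 6))
A = M @ M.T + np.eye(6)          # symmetric positive definite
b = rng.standard_normal(6)

blocks = [np.arange(0, 3), np.arange(3, 6)]
x = np.zeros(6)

for _ in range(1000):            # full sweeps over the blocks
    for idx in blocks:
        rest = np.setdiff1d(np.arange(6), idx)
        # Minimize f over x[idx] with x[rest] held fixed:
        # A[idx, idx] x[idx] = b[idx] - A[idx, rest] x[rest]
        rhs = b[idx] - A[np.ix_(idx, rest)] @ x[rest]
        x[idx] = np.linalg.solve(A[np.ix_(idx, idx)], rhs)

x_star = np.linalg.solve(A, b)   # direct solution for comparison
print(np.linalg.norm(x - x_star))
```

Each block update touches only a sub-block of A, which is the scalability property the snippet alludes to.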
First-order methods for convex optimization
First-order methods for solving convex optimization problems have been at the forefront of
mathematical optimization in the last 20 years. The rapid development of this important class …
Block-coordinate methods and restarting for solving extensive-form games
Coordinate descent methods are popular in machine learning and optimization for their
simple sparse updates and excellent practical performance. In the context of large-scale …
Accelerated cyclic coordinate dual averaging with extrapolation for composite convex optimization
Exploiting partial first-order information in a cyclic way is arguably the most natural strategy
to obtain scalable first-order methods. However, despite their wide use in practice, cyclic …
Joint graph learning and blind separation of smooth graph signals using minimization of mutual information and Laplacian quadratic forms
The smoothness of graph signals has found desirable real applications for processing
irregular (graph-based) signals. When the latent sources of the mixtures provided to us as …