Fast convergence of dynamical ADMM via time scaling of damped inertial dynamics
In this paper, we propose in a Hilbertian setting a second-order time-continuous dynamic
system with fast convergence guarantees to solve structured convex minimization problems …
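The snippet breaks off before the dynamics are stated. For orientation only, a standard template for damped inertial gradient dynamics with time scaling (written here for a smooth convex function f; the paper's system is formulated for the structured ADMM setting and is not reproduced) is

\ddot{x}(t) + \frac{\alpha}{t}\,\dot{x}(t) + \beta(t)\,\nabla f(x(t)) = 0,

where \alpha > 0 is the viscous damping parameter and \beta(t) > 0 is the time-scaling coefficient. Under standard growth conditions on \beta, the objective gap f(x(t)) - \min f decays at the rate O(1/(t^{2}\beta(t))), which is the mechanism the title refers to as fast convergence via time scaling.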
Robust multiple subspaces transfer for heterogeneous domain adaptation
Heterogeneous domain adaptation (HDA) aims to transfer knowledge from a source domain to a heterogeneous target domain. Previous works typically inject knowledge from …
Convergence results of two-step inertial proximal point algorithm
This paper proposes a two-point inertial proximal point algorithm to find a zero of a maximal monotone operator in Hilbert spaces. We obtain weak convergence results and non …
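For context, here is a minimal sketch of a generic proximal point iteration with two inertial (extrapolation) terms, x_{k+1} = J_{\lambda A}(x_k + \theta(x_k - x_{k-1}) + \delta(x_{k-1} - x_{k-2})). The operator A (gradient of a convex quadratic), the step \lambda and the weights \theta, \delta are illustrative choices, not the parameter conditions analysed in the paper.

import numpy as np

# Two-term inertial proximal point sketch for A(x) = Qx - b with Q positive
# definite, so A is maximally monotone and J_{lam A} = (I + lam*Q)^{-1}(. + lam*b).
rng = np.random.default_rng(0)
n = 20
M = rng.standard_normal((n, n))
Q = M @ M.T + np.eye(n)                  # positive definite
b = rng.standard_normal(n)
lam, theta, delta = 1.0, 0.3, 0.1        # illustrative step and inertial weights

J = np.linalg.inv(np.eye(n) + lam * Q)   # resolvent of lam*A (linear case)

x_prev2 = x_prev = x = np.zeros(n)
for k in range(200):
    # extrapolate using the last three iterates (the double inertia)
    w = x + theta * (x - x_prev) + delta * (x_prev - x_prev2)
    x_prev2, x_prev = x_prev, x
    x = J @ (w + lam * b)                # x_{k+1} = J_{lam A}(w_k)

print("residual ||Qx - b|| =", np.linalg.norm(Q @ x - b))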
Anderson acceleration of proximal gradient methods
V Mai, M Johansson - International Conference on Machine …, 2020 - proceedings.mlr.press
Anderson acceleration is a well-established and simple technique for speeding up fixed-
point computations with countless applications. This work introduces novel methods for …
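For context, the sketch below applies plain Anderson acceleration with memory m to the proximal gradient fixed-point map of a lasso problem min 0.5*||Ax - b||^2 + reg*||x||_1. It is an unguarded textbook AA(m); the paper's methods add stabilization and guarantees that this sketch omits.

import numpy as np

def soft_threshold(z, t):
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

rng = np.random.default_rng(1)
m_rows, n = 60, 30
A = rng.standard_normal((m_rows, n))
b = rng.standard_normal(m_rows)
reg = 0.1
step = 1.0 / np.linalg.norm(A, 2) ** 2        # 1/L step size

def g(x):
    # proximal gradient map; its fixed points are the lasso solutions
    return soft_threshold(x - step * A.T @ (A @ x - b), step * reg)

mem = 5                                       # AA memory
x = np.zeros(n)
G, F = [], []                                 # histories of g(x_k) and residuals f_k = g(x_k) - x_k
for k in range(100):
    gx = g(x)
    f = gx - x
    G.append(gx); F.append(f)
    if len(F) > mem + 1:                      # keep at most mem+1 past pairs
        G.pop(0); F.pop(0)
    if len(F) == 1:
        x = gx                                # plain fixed-point step to build history
    else:
        dF = np.column_stack([F[i + 1] - F[i] for i in range(len(F) - 1)])
        dG = np.column_stack([G[i + 1] - G[i] for i in range(len(G) - 1)])
        gamma, *_ = np.linalg.lstsq(dF, f, rcond=None)
        x = gx - dG @ gamma                   # AA(m) extrapolated iterate

print("fixed-point residual:", np.linalg.norm(g(x) - x))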
Beyond l1: Faster and better sparse models with skglm
We propose a new fast algorithm to estimate any sparse generalized linear model with
convex or non-convex separable penalties. Our algorithm is able to solve problems with …
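As an illustration of the problem class only, and deliberately not using the skglm API, the sketch below fits a sparse linear model with a non-convex separable MCP penalty by proximal gradient; the MCP parameters and step size are arbitrary.

import numpy as np

def mcp_prox(z, lam, gamma, step):
    # prox of the MCP penalty for step size `step` (firm thresholding); requires gamma > step
    out = z.copy()
    small = np.abs(z) <= step * lam
    mid = (~small) & (np.abs(z) <= gamma * lam)
    out[small] = 0.0
    out[mid] = np.sign(z[mid]) * (np.abs(z[mid]) - step * lam) / (1.0 - step / gamma)
    return out                                # entries with |z| > gamma*lam are untouched

rng = np.random.default_rng(2)
n_samples, n_features = 100, 50
X = rng.standard_normal((n_samples, n_features))
w_true = np.zeros(n_features); w_true[:5] = 1.0
y = X @ w_true + 0.1 * rng.standard_normal(n_samples)

lam, gamma = 0.5, 3.0                         # MCP parameters (gamma must exceed the step)
step = n_samples / np.linalg.norm(X, 2) ** 2  # 1/L for the (1/2n)||Xw - y||^2 datafit
w = np.zeros(n_features)
for _ in range(500):
    grad = X.T @ (X @ w - y) / n_samples
    w = mcp_prox(w - step * grad, lam, gamma, step)

print("recovered support:", np.flatnonzero(np.abs(w) > 1e-3))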
Strongly convergent inertial proximal point algorithm without on-line rule
We present a strongly convergent Halpern-type proximal point algorithm with double inertial
effects to find a zero of a maximal monotone operator in Hilbert spaces. The strong …
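Schematically, Halpern-type proximal point methods with double inertial effects combine an anchoring step with a two-term extrapolation; a generic template (not the paper's parameter rules) reads

w_n = x_n + \theta_n (x_n - x_{n-1}) + \delta_n (x_{n-1} - x_{n-2}),
x_{n+1} = \alpha_n u + (1 - \alpha_n)\, J_{\lambda A}(w_n),

where u is the anchor point, J_{\lambda A} = (I + \lambda A)^{-1} is the resolvent of the maximal monotone operator A, and \alpha_n \to 0 with \sum_n \alpha_n = \infty is the Halpern sequence that typically yields strong convergence to the zero of A nearest to u.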
Modified proximal gradient methods involving double inertial extrapolations for monotone inclusion
In this work, we propose a novel class of forward-backward-forward algorithms for solving monotone inclusion problems. Our approach incorporates a self-adaptive technique to …
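As a rough illustration, the sketch below combines a Tseng forward-backward-forward step with a double inertial extrapolation and a simple self-adaptive step-size rule for the inclusion 0 in A(x) + B(x), with A the subdifferential of reg*||.||_1 and B the gradient of a smooth quadratic. All parameter choices are illustrative, not the paper's.

import numpy as np

def soft_threshold(z, t):
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

rng = np.random.default_rng(3)
m, n = 200, 20
M = rng.standard_normal((m, n)) / np.sqrt(m)
b = rng.standard_normal(m)
reg = 0.1

def B(x):
    # single-valued part: gradient of 0.5*||Mx - b||^2, monotone and Lipschitz
    return M.T @ (M @ x - b)

lam, mu = 1.0, 0.7                   # initial step and adaptivity factor in (0, 1)
theta, delta = 0.2, 0.05             # double inertial weights (illustrative)
x_prev2 = x_prev = x = np.zeros(n)

for k in range(200):
    w = x + theta * (x - x_prev) + delta * (x_prev - x_prev2)
    Bw = B(w)
    y = soft_threshold(w - lam * Bw, lam * reg)   # forward-backward step: J_{lam A}(w - lam*Bw)
    By = B(y)
    x_prev2, x_prev = x_prev, x
    x = y - lam * (By - Bw)                       # second forward (correction) step
    gap = np.linalg.norm(Bw - By)
    if gap > 0:                                   # self-adaptive rule: no Lipschitz constant needed
        lam = min(lam, mu * np.linalg.norm(w - y) / gap)

print("fixed-point residual:", np.linalg.norm(x - soft_threshold(x - B(x), reg)))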
Anderson acceleration for nonconvex ADMM based on Douglas‐Rachford splitting
The alternating direction method of multipliers (ADMM) is widely used in computer graphics for solving optimization problems that can be nonsmooth and nonconvex. It converges quickly …
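The connection such methods rely on is that, in the convex two-block case, ADMM is equivalent to Douglas-Rachford (DR) splitting applied to a reformulated problem, so the whole iteration can be read as a fixed-point map and the acceleration applied to that map. For min_x f(x) + g(x) the DR map is

s_{k+1} = T(s_k), \qquad T(s) = s + \mathrm{prox}_{\gamma g}(2\,\mathrm{prox}_{\gamma f}(s) - s) - \mathrm{prox}_{\gamma f}(s),

with primal iterate x_k = \mathrm{prox}_{\gamma f}(s_k). In the nonsmooth nonconvex setting this is typically combined with a safeguard on the accelerated steps, which the convex template above does not capture.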
On the asymptotic linear convergence speed of Anderson acceleration applied to ADMM
Empirical results show that Anderson acceleration (AA) can be a powerful mechanism to
improve the asymptotic linear convergence speed of the Alternating Direction Method of …
Geometry of first-order methods and adaptive acceleration
First-order operator splitting methods are ubiquitous across many fields of science and engineering, such as inverse problems, signal/image processing, statistics, data …