Harnessing structures in big data via guaranteed low-rank matrix estimation: Recent theory and fast algorithms via convex and nonconvex optimization
Low-rank modeling plays a pivotal role in signal processing and machine learning, with
applications ranging from collaborative filtering, video surveillance, and medical imaging to …
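For reference on the low-rank estimation setting this survey covers, the sketch below is a minimal illustration only, not the paper's convex (nuclear-norm) or nonconvex (factored) algorithms: it recovers a rank-r approximation of a noisy matrix by truncated SVD, with the rank and noise level chosen purely for the example.

```python
import numpy as np

def truncated_svd_estimate(Y, r):
    """Best rank-r approximation of Y in Frobenius norm (Eckart-Young)."""
    U, s, Vt = np.linalg.svd(Y, full_matrices=False)
    return (U[:, :r] * s[:r]) @ Vt[:r, :]

# Toy instance: a rank-3 ground truth observed with additive Gaussian noise.
rng = np.random.default_rng(0)
L = rng.standard_normal((100, 3)) @ rng.standard_normal((3, 80))
Y = L + 0.1 * rng.standard_normal((100, 80))
L_hat = truncated_svd_estimate(Y, r=3)
print(np.linalg.norm(L_hat - L) / np.linalg.norm(L))  # relative estimation error
```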
Chain of LoRA: Efficient fine-tuning of language models via residual learning
Fine-tuning is the primary methodology for tailoring pre-trained large language models to
specific tasks. As the model's scale and the diversity of tasks expand, parameter-efficient fine …
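As background for the parameter-efficient fine-tuning this entry discusses, the sketch below shows a generic LoRA-style low-rank update (a frozen weight W plus a trainable product B A with B zero-initialized); it is an assumed minimal form, not the Chain-of-LoRA residual-chaining procedure itself, and the rank and shapes are illustrative.

```python
import numpy as np

class LoRALinear:
    """Frozen weight W plus a trainable low-rank update B @ A (generic LoRA form)."""
    def __init__(self, W, rank=8, alpha=16.0, seed=0):
        rng = np.random.default_rng(seed)
        self.W = W                                                # frozen pre-trained weight, shape (out, in)
        self.A = 0.01 * rng.standard_normal((rank, W.shape[1]))   # trainable
        self.B = np.zeros((W.shape[0], rank))                     # trainable, zero-init so the update starts at 0
        self.scale = alpha / rank

    def forward(self, x):
        return self.W @ x + self.scale * (self.B @ (self.A @ x))

    def merge(self):
        """Fold the learned low-rank update back into the base weight."""
        return self.W + self.scale * (self.B @ self.A)
```

Chain-style schemes, per the abstract's framing, repeatedly learn and merge such residual low-rank updates; the code above only shows a single adapter.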
Introduction to online convex optimization
E Hazan - Foundations and Trends® in Optimization, 2016 - nowpublishers.com
This monograph portrays optimization as a process. In many practical applications the
environment is so complex that it is infeasible to lay out a comprehensive theoretical model …
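Online gradient descent with projection is the canonical algorithm of the online convex optimization framework this monograph covers; the sketch below is a minimal version over the Euclidean unit ball, with the loss sequence and step-size schedule chosen only for illustration.

```python
import numpy as np

def project_ball(x, radius=1.0):
    """Euclidean projection onto the ball of the given radius."""
    n = np.linalg.norm(x)
    return x if n <= radius else x * (radius / n)

def online_gradient_descent(grads, dim, eta=0.1):
    """Play x_t, observe the gradient g_t of the t-th convex loss, take a projected step."""
    x = np.zeros(dim)
    iterates = []
    for t, g in enumerate(grads, start=1):
        iterates.append(x.copy())
        x = project_ball(x - (eta / np.sqrt(t)) * g)  # standard 1/sqrt(t) schedule
    return iterates

# Toy run: gradients of linear losses f_t(x) = <z_t, x> for random z_t.
rng = np.random.default_rng(1)
zs = [rng.standard_normal(5) for _ in range(100)]
xs = online_gradient_descent(zs, dim=5)
```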
Breaking the linear iteration cost barrier for some well-known conditional gradient methods using maxip data-structures
Conditional gradient methods (CGM) are widely used in modern machine learning. CGM's
overall running time usually consists of two parts: the number of iterations and the cost of …
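The two cost components the abstract decomposes are visible in a plain conditional-gradient (Frank-Wolfe) loop: the outer iteration count and the per-iteration cost of the linear minimization oracle (LMO). The sketch below is a generic version, not this paper's MaxIP-accelerated method; it uses an ℓ1-ball oracle, whose linear step costs O(d), on a made-up least-squares instance.

```python
import numpy as np

def lmo_l1_ball(g, radius=1.0):
    """Linear minimization oracle over the l1 ball: argmin_{||v||_1 <= r} <g, v>. Costs O(d)."""
    i = np.argmax(np.abs(g))
    v = np.zeros_like(g)
    v[i] = -radius * np.sign(g[i])
    return v

def frank_wolfe(grad, x0, steps=200, radius=1.0):
    x = x0
    for t in range(steps):                 # total time = steps * (gradient cost + LMO cost)
        v = lmo_l1_ball(grad(x), radius)   # one LMO call per iteration
        gamma = 2.0 / (t + 2.0)            # standard step size
        x = (1 - gamma) * x + gamma * v
    return x

# Least-squares over the l1 ball, purely illustrative.
rng = np.random.default_rng(2)
A, b = rng.standard_normal((50, 20)), rng.standard_normal(50)
x_hat = frank_wolfe(lambda x: A.T @ (A @ x - b), np.zeros(20), radius=1.0)
```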
Faster projection-free online learning
In many online learning problems the computational bottleneck for gradient-based methods
is the projection operation. For this reason, in many problems the most efficient algorithms …
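The bottleneck the abstract refers to, and the standard way projection-free methods sidestep it, can be written in two lines; these are the generic formulations, not the specific algorithms of this paper.

```latex
% Projected online gradient step: requires a Euclidean projection onto K each round.
x_{t+1} = \Pi_{\mathcal{K}}\!\bigl(x_t - \eta_t \nabla f_t(x_t)\bigr)

% Projection-free (conditional-gradient) step: replaces the projection with a
% linear minimization oracle over K, which is often far cheaper.
v_t = \arg\min_{v \in \mathcal{K}} \langle \nabla f_t(x_t), v \rangle, \qquad
x_{t+1} = (1 - \gamma_t)\, x_t + \gamma_t v_t
```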
Projection-free optimization on uniformly convex sets
The Frank-Wolfe method solves smooth constrained convex optimization problems
at a generic sublinear rate of $\mathcal{O}(1/T)$, and it (or its variants) enjoys accelerated …
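The generic sublinear rate quoted in the abstract is the classical Frank-Wolfe guarantee; one common form of it is stated below (the accelerated rates for uniformly convex sets that the paper studies are not reproduced here).

```latex
% Frank-Wolfe with step size \gamma_t = 2/(t+2) on an L-smooth convex f over a
% compact convex set \mathcal{K} of Euclidean diameter D:
f(x_T) - \min_{x \in \mathcal{K}} f(x) \;\le\; \frac{2 L D^2}{T + 2} \;=\; \mathcal{O}(1/T)
```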
Fast projection onto convex smooth constraints
The Euclidean projection onto a convex set is an important problem that arises in numerous
constrained optimization tasks. Unfortunately, in many cases, computing projections is …
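As a concrete instance of the projection problem this entry addresses, the sketch below computes the Euclidean projection onto the probability simplex with the standard sort-and-threshold algorithm; this is a classical special case with an efficient exact solution, not the paper's method for general smooth constraints.

```python
import numpy as np

def project_simplex(v):
    """Euclidean projection of v onto the probability simplex {x >= 0, sum(x) = 1}.
    Standard sort-and-threshold construction, O(d log d)."""
    u = np.sort(v)[::-1]                  # sort entries in decreasing order
    cssv = np.cumsum(u) - 1.0
    rho = np.nonzero(u * np.arange(1, len(v) + 1) > cssv)[0][-1]
    theta = cssv[rho] / (rho + 1.0)       # shift that makes the projection feasible
    return np.maximum(v - theta, 0.0)

x = project_simplex(np.array([0.5, 1.2, -0.3]))
print(x, x.sum())  # nonnegative entries summing to 1
```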
Iterative hard thresholding with adaptive regularization: Sparser solutions without sacrificing runtime
We propose a simple modification to the iterative hard thresholding (IHT) algorithm, which
recovers asymptotically sparser solutions as a function of the condition number. When …
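For context, plain iterative hard thresholding for sparse recovery from y ≈ A x looks as follows; the adaptive-regularization modification the abstract proposes is not reproduced here, and the step size and sparsity level are illustrative.

```python
import numpy as np

def hard_threshold(x, k):
    """Keep the k largest-magnitude entries of x, zero out the rest."""
    out = np.zeros_like(x)
    idx = np.argpartition(np.abs(x), -k)[-k:]
    out[idx] = x[idx]
    return out

def iht(A, y, k, eta=None, iters=300):
    """Plain IHT: gradient step on (1/2)||Ax - y||^2 followed by hard thresholding."""
    if eta is None:
        eta = 1.0 / np.linalg.norm(A, 2) ** 2  # conservative step size
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        x = hard_threshold(x - eta * A.T @ (A @ x - y), k)
    return x

# Toy sparse recovery instance.
rng = np.random.default_rng(3)
A = rng.standard_normal((80, 200)) / np.sqrt(80)
x_true = np.zeros(200)
x_true[:5] = rng.standard_normal(5)
x_hat = iht(A, A @ x_true, k=5)
```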
Minimally distorted structured adversarial attacks
White box adversarial perturbations are generated via iterative optimization algorithms most
often by minimizing an adversarial loss on an ℓp neighborhood of the original image, the so …
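The iterative optimization the abstract describes is typified by projected gradient ascent on an ℓ∞ neighborhood; the sketch below is that generic attack, not the structured, minimally distorted attack of this paper, and `loss_grad` is an assumed user-supplied gradient of the adversarial loss with respect to the input.

```python
import numpy as np

def pgd_linf(x0, loss_grad, eps=8 / 255, step=2 / 255, iters=10):
    """Generic PGD attack: ascend the adversarial loss while staying in the
    l_inf ball of radius eps around x0 and in the valid pixel range [0, 1]."""
    x = x0.copy()
    for _ in range(iters):
        x = x + step * np.sign(loss_grad(x))   # signed gradient ascent step
        x = np.clip(x, x0 - eps, x0 + eps)     # project onto the l_inf neighborhood
        x = np.clip(x, 0.0, 1.0)               # keep a valid image
    return x
```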
Improved regret bounds for projection-free bandit convex optimization
We revisit the challenge of designing online algorithms for the bandit convex optimization
problem (BCO) which are also scalable to high dimensional problems. Hence, we consider …
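In the bandit setting only the loss value at the played point is revealed, so a standard device (general BCO machinery, not specific to this paper) is the single-point gradient estimate obtained by perturbing the decision on the unit sphere.

```latex
% Sample u_t uniformly from the unit sphere and play y_t = x_t + \delta u_t; then
\hat{g}_t = \frac{d}{\delta}\, f_t(x_t + \delta u_t)\, u_t,
\qquad \mathbb{E}[\hat{g}_t] = \nabla \hat{f}_t(x_t),
% where \hat{f}_t(x) = \mathbb{E}_{v \sim \mathrm{Unif}(B)}[f_t(x + \delta v)] is a
% \delta-smoothed surrogate of f_t; \hat{g}_t then drives a first-order
% (here, projection-free) update.
```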