Exact expressions for double descent and implicit regularization via surrogate random design
Double descent refers to the phase transition that is exhibited by the generalization error of unregularized learning models when varying the ratio between the number of parameters …
Recent and upcoming developments in randomized numerical linear algebra for machine learning
Large matrices arise in many machine learning and data analysis applications, including as representations of datasets, graphs, model weights, and first and second-order derivatives …
Newton-LESS: Sparsification without trade-offs for the sketched Newton update
In second-order optimization, a potential bottleneck can be computing the Hessian matrix of the optimized function at every iteration. Randomized sketching has emerged as a powerful …
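A minimal sketch of the generic "sketch-and-solve" Newton idea that this line of work builds on, assuming a least-squares-type Hessian A^T A; the dense Gaussian sketch, sketch size m, and damping constant below are illustrative placeholders, not the LESS embeddings or guarantees of the paper.

```python
# Generic sketched Newton step (illustrative only; not the Newton-LESS method).
# The Hessian A.T @ A is replaced by (S @ A).T @ (S @ A) for a random sketch S.
import numpy as np

def sketched_newton_step(A, grad, m, rng=None):
    """Return an approximate Newton direction H_hat^{-1} grad.

    A    : (n, d) matrix whose Gram matrix A.T @ A plays the role of the Hessian
    grad : (d,) gradient at the current iterate
    m    : sketch size, m << n
    """
    rng = np.random.default_rng(0) if rng is None else rng
    n, d = A.shape
    S = rng.standard_normal((m, n)) / np.sqrt(m)   # dense Gaussian sketch (placeholder choice)
    SA = S @ A                                     # (m, d) sketched data
    H_hat = SA.T @ SA                              # sketched Hessian estimate
    return np.linalg.solve(H_hat + 1e-8 * np.eye(d), grad)  # small ridge for numerical safety
```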
Bagging in overparameterized learning: Risk characterization and risk monotonization
Bagging is a commonly used ensemble technique in statistics and machine learning to improve the performance of prediction procedures. In this paper, we study the prediction risk …
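For reference, a minimal sketch of plain bagging for a least-squares predictor; the bootstrap resampling, number of bags, and use of the minimum-norm solution are generic illustrative choices, not the specific subsampling schemes or risk characterizations analyzed in the paper.

```python
# Generic bagged least-squares prediction (illustrative only).
import numpy as np

def bagged_lstsq_predict(X, y, X_test, n_bags=20, rng=None):
    """Average the predictions of least-squares fits on bootstrap resamples.

    np.linalg.lstsq returns the minimum-norm solution, so this also runs in the
    overparameterized regime where X has more columns than rows.
    """
    rng = np.random.default_rng(0) if rng is None else rng
    n = len(y)
    preds = []
    for _ in range(n_bags):
        idx = rng.integers(0, n, size=n)                     # bootstrap resample
        w, *_ = np.linalg.lstsq(X[idx], y[idx], rcond=None)  # per-bag fit
        preds.append(X_test @ w)
    return np.mean(preds, axis=0)                            # bagged prediction
```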
Hessian averaging in stochastic Newton methods achieves superlinear convergence
We consider minimizing a smooth and strongly convex objective function using a stochastic Newton method. At each iteration, the algorithm is given oracle access to a stochastic …
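A minimal sketch of the basic idea of averaging noisy Hessian estimates inside a stochastic Newton loop; the oracle interfaces, step count, and plain uniform running mean are assumptions for illustration and omit the weighting schemes and convergence analysis of the paper.

```python
# Stochastic Newton with a running average of Hessian estimates (illustrative only).
import numpy as np

def averaged_stochastic_newton(grad_fn, hess_est_fn, x0, steps=50):
    """grad_fn(x) -> gradient at x; hess_est_fn(x) -> noisy (d, d) Hessian estimate at x."""
    x = np.asarray(x0, dtype=float)
    H_avg = None
    for t in range(1, steps + 1):
        H_t = hess_est_fn(x)
        H_avg = H_t if H_avg is None else H_avg + (H_t - H_avg) / t  # uniform running mean
        x = x - np.linalg.solve(H_avg, grad_fn(x))                   # Newton step with averaged Hessian
    return x
```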
Asymptotics of the sketched pseudoinverse
We take a random matrix theory approach to random sketching and show an asymptotic first-order equivalence of the regularized sketched pseudoinverse of a positive semidefinite …
Precise expressions for random projections: Low-rank approximation and randomized Newton
It is often desirable to reduce the dimensionality of a large dataset by projecting it onto a low-dimensional subspace. Matrix sketching has emerged as a powerful technique for …
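A minimal sketch of the standard randomized range finder for projecting a matrix onto a low-dimensional subspace; the Gaussian test matrix and fixed target rank k are generic choices, and the precise expressions derived in the paper are not reproduced here.

```python
# Rank-k approximation via a random projection (illustrative only).
import numpy as np

def randomized_lowrank(A, k, rng=None):
    """Project A onto the k-dimensional subspace spanned by a sketched range of A."""
    rng = np.random.default_rng(0) if rng is None else rng
    Omega = rng.standard_normal((A.shape[1], k))   # random test matrix
    Q, _ = np.linalg.qr(A @ Omega)                 # orthonormal basis for the sketched range
    return Q @ (Q.T @ A)                           # projection of A onto that subspace
```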
Sampling from a k-DPP without looking at all items
Determinantal point processes (DPPs) are a useful probabilistic model for selecting a small diverse subset out of a large collection of items, with applications in summarization …
Adaptive Newton sketch: Linear-time optimization with quadratic convergence and effective Hessian dimensionality
We propose a randomized algorithm with quadratic convergence rate for convex optimization problems with a self-concordant, composite, strongly convex objective function …
Effective dimension adaptive sketching methods for faster regularized least-squares optimization
We propose a new randomized algorithm for solving L2-regularized least-squares problems based on sketching. We consider two of the most popular random embeddings, namely …
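A minimal sketch-and-solve baseline for L2-regularized least squares, assuming a dense Gaussian embedding and a fixed sketch size; the adaptive, effective-dimension-based choice of sketch size and the particular embeddings studied in the paper are not reproduced here.

```python
# Sketch-and-solve ridge regression baseline (illustrative only).
import numpy as np

def sketch_and_solve_ridge(A, b, lam, m, rng=None):
    """Approximate argmin_x ||A x - b||^2 + lam * ||x||^2 from sketched data.

    A : (n, d) design matrix, b : (n,) targets, lam : ridge parameter, m : sketch size (m << n)
    """
    rng = np.random.default_rng(0) if rng is None else rng
    n, d = A.shape
    S = rng.standard_normal((m, n)) / np.sqrt(m)    # dense Gaussian embedding (one of many options)
    SA, Sb = S @ A, S @ b                           # sketched design and targets
    return np.linalg.solve(SA.T @ SA + lam * np.eye(d), SA.T @ Sb)
```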