Learning curves of generic features maps for realistic datasets with a teacher-student model
Teacher-student models provide a framework in which the typical-case performance of high-
dimensional supervised learning can be described in closed form. The assumptions of …
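As a rough illustration of the setup this entry refers to, here is a minimal sketch assuming Gaussian i.i.d. inputs, a fixed random linear teacher, and a ridge-regression student; all dimensions, noise levels, and names are illustrative choices, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
d, n_train, n_test = 200, 400, 2000        # dimension and sample sizes (illustrative)

# Teacher: a fixed random direction generating noisy labels y = x.w*/sqrt(d) + noise.
w_star = rng.standard_normal(d)

def sample(n, noise=0.1):
    X = rng.standard_normal((n, d))
    y = X @ w_star / np.sqrt(d) + noise * rng.standard_normal(n)
    return X, y

X_tr, y_tr = sample(n_train)
X_te, y_te = sample(n_test)

# Student: ridge regression fitted on the teacher-generated data.
lam = 1e-2
w_hat = np.linalg.solve(X_tr.T @ X_tr + lam * np.eye(d), X_tr.T @ y_tr)
print("test MSE:", np.mean((X_te @ w_hat - y_te) ** 2))
```

Sweeping the training set size at fixed dimension (or at fixed ratio n/d) is how the learning curves mentioned in the title are traced out empirically.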
[BOOK][B] Random matrix methods for machine learning
R Couillet, Z Liao - 2022 - books.google.com
This book presents a unified theory of random matrices for applications in machine learning,
offering a large-dimensional data vision that exploits concentration and universality …
Geometric dataset distances via optimal transport
The notion of task similarity is at the core of various machine learning paradigms, such as
domain adaptation and meta-learning. Current methods to quantify it are often heuristic …
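This entry is about quantifying task similarity with optimal transport. The sketch below computes a plain feature-space optimal-transport distance between two equally sized point clouds with uniform weights, in which case OT reduces to an assignment problem; the paper's actual distance additionally incorporates label information, which this sketch omits. All data and parameters are illustrative.

```python
import numpy as np
from scipy.optimize import linear_sum_assignment
from scipy.spatial.distance import cdist

def ot_distance(X, Y):
    """Exact 2-Wasserstein distance between two equal-size point clouds
    with uniform weights: optimal transport reduces to a one-to-one matching."""
    C = cdist(X, Y, metric="sqeuclidean")      # pairwise transport costs
    rows, cols = linear_sum_assignment(C)      # optimal assignment
    return np.sqrt(C[rows, cols].mean())

rng = np.random.default_rng(1)
A = rng.standard_normal((100, 5))              # "dataset" A (features only, illustrative)
B = rng.standard_normal((100, 5)) + 0.5        # "dataset" B, slightly shifted
print(ot_distance(A, B))
```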
Generalisation error in learning with random features and the hidden manifold model
We study generalised linear regression and classification for a synthetically generated
dataset encompassing different problems of interest, such as learning with random features …
Universality laws for high-dimensional learning with random features
We prove a universality theorem for learning with random features. Our result shows that, in
terms of training and generalization errors, a random feature model with a nonlinear …
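The last two entries concern learning with nonlinear random features and the claim that, as far as training and generalisation errors go, such features behave like a Gaussian model with matched first and second moments. The sketch below is only a numerical illustration of that equivalence under simple assumptions (Gaussian inputs, a linear teacher, tanh features, ridge regression); the constants mu0, mu1, mu_star are the usual Gaussian-equivalence coefficients of the activation, estimated by Monte Carlo. None of the specific values come from the papers.

```python
import numpy as np

rng = np.random.default_rng(3)
d, p, n = 400, 600, 800                  # input dim, features, samples (illustrative)
lam, noise = 1e-1, 0.1

# Monte Carlo estimates of the Gaussian-equivalence coefficients of the activation.
u = rng.standard_normal(1_000_000)
act = np.tanh
mu0 = act(u).mean()
mu1 = (u * act(u)).mean()
mu_star = np.sqrt(max(act(u).var() - mu1 ** 2, 0.0))

F = rng.standard_normal((p, d))          # fixed random projection
w_star = rng.standard_normal(d)          # linear teacher acting on the inputs

def data(n):
    X = rng.standard_normal((n, d))
    y = X @ w_star / np.sqrt(d) + noise * rng.standard_normal(n)
    G = X @ F.T / np.sqrt(d)
    Z_nonlin = act(G)                                                 # actual random features
    Z_equiv = mu0 + mu1 * G + mu_star * rng.standard_normal((n, p))   # Gaussian equivalent
    return y, Z_nonlin, Z_equiv

def ridge_test_mse(Ztr, ytr, Zte, yte):
    a = np.linalg.solve(Ztr.T @ Ztr + lam * np.eye(p), Ztr.T @ ytr)
    return np.mean((Zte @ a - yte) ** 2)

ytr, Ztr_nl, Ztr_eq = data(n)
yte, Zte_nl, Zte_eq = data(4000)
print("nonlinear features :", ridge_test_mse(Ztr_nl, ytr, Zte_nl, yte))
print("Gaussian equivalent:", ridge_test_mse(Ztr_eq, ytr, Zte_eq, yte))
```

In the proportional regime (n, p, d large at fixed ratios) the two printed errors should be close, which is the content of the universality statement.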
Modeling the influence of data structure on learning in neural networks: The hidden manifold model
Understanding the reasons for the success of deep neural networks trained using stochastic
gradient-based methods is a key open problem for the nascent theory of deep learning. The …
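A minimal sketch of the hidden manifold model named in this entry, under illustrative choices: inputs are a fixed nonlinear projection of a low-dimensional latent Gaussian vector, and labels depend on the latent vector rather than on the inputs directly. The student shown here (regularised least squares) is only a placeholder.

```python
import numpy as np

rng = np.random.default_rng(4)
D, N, n = 20, 500, 1000                  # latent dim, ambient dim, samples (illustrative)

# Hidden manifold model: high-dimensional inputs generated from a low-dimensional
# latent vector z, with labels that are a function of z rather than of x.
C = rng.standard_normal((N, D))          # fixed projection defining the "manifold"
theta = rng.standard_normal(D)           # latent teacher

Z = rng.standard_normal((n, D))          # latent variables
X = np.tanh(Z @ C.T / np.sqrt(D))        # observed inputs lie near a D-dim manifold in R^N
y = np.sign(Z @ theta / np.sqrt(D))      # labels depend only on the latent z

# A simple student: ridge-regularised least squares on the observed inputs.
lam = 1e-1
w = np.linalg.solve(X.T @ X + lam * np.eye(N), X.T @ y)
print("train accuracy:", np.mean(np.sign(X @ w) == y))
```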
The Gaussian equivalence of generative models for learning with shallow neural networks
Understanding the impact of data structure on the computational tractability of learning is a
key challenge for the theory of neural networks. Many theoretical works do not explicitly …
A random matrix analysis of random Fourier features: beyond the Gaussian kernel, a precise phase transition, and the corresponding double descent
This article characterizes the exact asymptotics of random Fourier feature (RFF) regression,
in the realistic setting where the number of data samples $ n $, their dimension $ p $, and …
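A brief sketch of the object analysed in this entry, random Fourier feature regression: cosine features with Gaussian frequencies approximate a Gaussian (RBF) kernel, and ridge regression is performed in the feature space. The target function, bandwidth, and dimensions are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(5)
d, p, n = 10, 500, 400                   # input dim, Fourier features, samples (illustrative)
sigma = 1.0                              # bandwidth of the Gaussian kernel being approximated

# Random Fourier features: z(x) = sqrt(2/p) * cos(W x + b), W ~ N(0, 1/sigma^2), b ~ U[0, 2*pi].
W = rng.standard_normal((p, d)) / sigma
b = rng.uniform(0, 2 * np.pi, size=p)

def rff(X):
    return np.sqrt(2.0 / p) * np.cos(X @ W.T + b)

X = rng.standard_normal((n, d))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(n)        # a simple nonlinear target (assumption)

Z = rff(X)
lam = 1e-3
a = np.linalg.solve(Z.T @ Z + lam * np.eye(p), Z.T @ y)   # ridge regression in feature space

X_te = rng.standard_normal((1000, d))
print("test MSE:", np.mean((rff(X_te) @ a - np.sin(X_te[:, 0])) ** 2))
```

Sweeping the number of features p relative to the sample size n is the regime in which the double descent mentioned in the title appears.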
[PDF] Learning Gaussian mixtures with generalized linear models: Precise asymptotics in high-dimensions
Generalised linear models for multi-class classification problems are one of the fundamental
building blocks of modern machine learning tasks. In this manuscript, we characterise the …
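A minimal instance of the setting in this entry, with illustrative parameters: data drawn from a two-cluster Gaussian mixture and a generalised linear model (L2-regularised logistic regression) trained by plain gradient descent.

```python
import numpy as np

rng = np.random.default_rng(6)
d, n = 50, 2000                              # dimension and sample size (illustrative)

mu = rng.standard_normal(d) / np.sqrt(d)     # cluster mean direction

def sample(n):
    """Two-cluster Gaussian mixture: x ~ N(y * mu, I_d) with label y = +/-1."""
    y = rng.choice([-1.0, 1.0], size=n)
    X = y[:, None] * mu + rng.standard_normal((n, d))
    return X, y

X, y = sample(n)

# Generalised linear model: L2-regularised logistic regression via gradient descent.
w, lr, lam = np.zeros(d), 0.5, 1e-3
for _ in range(500):
    sig = 1.0 / (1.0 + np.exp(y * (X @ w)))             # derivative weight of the logistic loss
    grad = -(X * (y * sig)[:, None]).mean(axis=0) + lam * w
    w -= lr * grad

X_te, y_te = sample(4000)
print("test accuracy:", np.mean(np.sign(X_te @ w) == y_te))
```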
Universality laws for Gaussian mixtures in generalized linear models
A recent line of work in high-dimensional statistics working under the Gaussian mixture
hypothesis has led to a number of results in the context of empirical risk minimization …