A survey on statistical theory of deep learning: Approximation, training dynamics, and generative models
In this article, we review the literature on statistical theories of neural networks from three
perspectives: approximation, training dynamics, and generative models. In the first part …
Classification with deep neural networks and logistic loss
Z Zhang, L Shi, DX Zhou - Journal of Machine Learning Research, 2024 - jmlr.org
Deep neural networks (DNNs) trained with the logistic loss (also known as the cross entropy
loss) have made impressive advancements in various binary classification tasks. Despite the …
Generalization analysis of deep CNNs under maximum correntropy criterion
Y Zhang, Z Fang, J Fan - Neural Networks, 2024 - Elsevier
Convolutional neural networks (CNNs) have gained immense popularity in recent years,
finding their utility in diverse fields such as image recognition, natural language processing …
Optimal Rates of Approximation by Shallow ReLU Neural Networks and Applications to Nonparametric Regression
We study the approximation capacity of some variation spaces corresponding to shallow
ReLU^k neural networks. It is shown that sufficiently smooth functions are contained in these …
Deeper or wider: A perspective from optimal generalization error with Sobolev loss
Constructing the architecture of a neural network is a challenging pursuit for the machine
learning community, and the dilemma of whether to go deeper or wider remains a persistent …
Approximation with CNNs in Sobolev space: with applications to classification
We derive a novel approximation error bound with explicit prefactor for Sobolev-regular
functions using deep convolutional neural networks (CNNs). The bound is non-asymptotic in …
Approximation of nonlinear functionals using deep ReLU networks
In recent years, functional neural networks have been proposed and studied in order to
approximate nonlinear continuous functionals defined on L^p([-1, 1]^s) for integers s ≥ 1 and …
CNN models for readability of Chinese texts.
Readability of Chinese texts considered in this paper is a multi-class classification problem
with 12 grade classes corresponding to 6 grades in primary schools, 3 grades in middle …
On the rates of convergence for learning with convolutional neural networks
We study approximation and learning capacities of convolutional neural networks (CNNs)
with one-side zero-padding and multiple channels. Our first result proves a new …
Learning rates of deep nets for geometrically strongly mixing sequence
Y Men, L Li, Z Hu, Y Xu - IEEE Transactions on Neural …, 2024 - ieeexplore.ieee.org
The great success of deep learning poses an urgent challenge to establish the theoretical
basis for its working mechanism. Recently, research on the convergence of deep neural …