Machine learning and the physical sciences
Machine learning (ML) encompasses a broad range of algorithms and modeling tools used
for a vast array of data processing tasks, which has entered most scientific disciplines in …
A transdisciplinary review of deep learning research and its relevance for water resources scientists
Deep learning (DL), a new generation of artificial neural network research, has transformed
industries, daily lives, and various scientific disciplines in recent years. DL represents …
Computing nonvacuous generalization bounds for deep (stochastic) neural networks with many more parameters than training data
One of the defining properties of deep learning is that models are chosen to have many
more parameters than available training data. In light of this capacity for overfitting, it is …
Entropy-SGD: Biasing gradient descent into wide valleys
This paper proposes a new optimization algorithm called Entropy-SGD for training deep
neural networks that is motivated by the local geometry of the energy landscape. Local …
Statistical mechanics of deep learning
The recent striking success of deep neural networks in machine learning raises profound
questions about the theoretical principles underlying their success. For example, what can …
Empirical analysis of the Hessian of over-parametrized neural networks
We study the properties of common loss surfaces through their Hessian matrix. In particular,
in the context of deep learning, we empirically show that the spectrum of the Hessian is …
Stochastic gradient descent performs variational inference, converges to limit cycles for deep networks
Stochastic gradient descent (SGD) is widely believed to perform implicit regularization when
used to train deep neural networks, but the precise manner in which this occurs has thus far …
Optimal errors and phase transitions in high-dimensional generalized linear models
Generalized linear models (GLMs) are used in high-dimensional machine learning,
statistics, communications, and signal processing. In this paper we analyze GLMs when the …
Fast automated analysis of strong gravitational lenses with convolutional neural networks
Quantifying image distortions caused by strong gravitational lensing—the formation of
multiple images of distant sources due to the deflection of their light by the gravity of …
Implicit self-regularization in deep neural networks: Evidence from random matrix theory and implications for learning
Random Matrix Theory (RMT) is applied to analyze the weight matrices of Deep Neural
Networks (DNNs), including production-quality, pre-trained models such as AlexNet …