Cross-entropy loss functions: Theoretical analysis and applications
Cross-entropy is a widely used loss function in applications. It coincides with the logistic loss
applied to the outputs of a neural network when the softmax is used. But what guarantees …
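The snippet's claim — that cross-entropy with a one-hot target coincides with the multinomial logistic loss when applied to softmax outputs — can be checked numerically. A minimal sketch (the function names and the example logits are illustrative, not from the paper):

```python
import math

def softmax(logits):
    # Numerically stable softmax: subtract the max logit before exponentiating.
    m = max(logits)
    exps = [math.exp(z - m) for z in logits]
    total = sum(exps)
    return [e / total for e in exps]

def cross_entropy(probs, label):
    # Cross-entropy against a one-hot target: -log p_label.
    return -math.log(probs[label])

def logistic_loss(logits, label):
    # Multinomial logistic loss: log(sum_j exp(z_j - z_label)).
    return math.log(sum(math.exp(z - logits[label]) for z in logits))

logits = [2.0, 0.5, -1.0]
label = 0
# The two quantities agree up to floating-point error.
assert abs(cross_entropy(softmax(logits), label)
           - logistic_loss(logits, label)) < 1e-12
```

Algebraically: -log(softmax(z)_y) = log(sum_j exp(z_j)) - z_y = log(sum_j exp(z_j - z_y)), which is exactly the logistic loss term, so the identity holds for any logits.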
Optimal learners for realizable regression: PAC learning and online learning
In this work, we aim to characterize the statistical complexity of realizable regression both in
the PAC learning setting and the online learning setting. Previous work had established the …
Improved generalization bounds for robust learning
We consider a model of robust learning in an adversarial environment. The learner gets
uncorrupted training data with access to possible corruptions that may be effected by the …
Information complexity of stochastic convex optimization: Applications to generalization and memorization
In this work, we investigate the interplay between memorization and learning in the context
of stochastic convex optimization (SCO). We define memorization via the information …
On the computability of robust PAC learning
We initiate the study of computability requirements for adversarially robust learning.
Adversarially robust PAC-type learnability is by now an established field of research …
Agnostic sample compression schemes for regression
We obtain the first positive results for bounded sample compression in the agnostic
regression setting with the $\ell_p$ loss, where $p \in [1,\infty]$. We construct a generic …
Adversarially robust PAC learnability of real-valued functions
We study robustness to test-time adversarial attacks in the regression setting with $\ell_p$
losses and arbitrary perturbation sets. We address the question of which function classes …
Uniformly stable algorithms for adversarial training and beyond
In adversarial machine learning, neural networks suffer from a significant issue known as
robust overfitting, where the robust test accuracy decreases over epochs (Rice et al., 2020) …
Adversarially robust learning with uncertain perturbation sets
In many real-world settings, the exact perturbation sets to be used by an adversary are not
plausibly available to a learner. While prior literature has studied both scenarios with …
Improved generalization bounds for adversarially robust learning
We consider a model of robust learning in an adversarial environment. The learner gets
uncorrupted training data with access to possible corruptions that may be affected by the …