Adversarial robustness of neural networks from the perspective of Lipschitz calculus: A survey
We survey the adversarial robustness of neural networks from the perspective of Lipschitz
calculus in a unifying fashion by expressing models, attacks and safety guarantees—that is …
Pay attention to your loss: understanding misconceptions about Lipschitz neural networks
Lipschitz constrained networks have gathered considerable attention in the deep learning
community, with usages ranging from Wasserstein distance estimation to the training of …
Regularization of polynomial networks for image recognition
Abstract Deep Neural Networks (DNNs) have obtained impressive performance across
tasks; however, they still remain black boxes, e.g., hard to theoretically analyze. At the …
Extrapolation and spectral bias of neural nets with hadamard product: a polynomial net study
Neural tangent kernel (NTK) is a powerful tool to analyze training dynamics of neural
networks and their generalization bounds. The study on NTK has been devoted to typical …
Sound and complete verification of polynomial networks
Abstract Polynomial Networks (PNs) have demonstrated promising performance on face and
image recognition recently. However, robustness of PNs is unclear and thus obtaining …
Tensor methods in deep learning
Tensors are multidimensional arrays that can naturally represent data and mappings of
multiple dimensions, playing a central role in modern deep learning. Indeed, the basic …
Random polynomial neural networks: analysis and design
In this article, we propose the concept of random polynomial neural networks (RPNNs)
realized based on the architecture of polynomial neural networks (PNNs) with random …
Architecture Design: From Neural Networks to Foundation Models
G Chrysos - 2024 IEEE 11th International Conference on Data …, 2024 - ieeexplore.ieee.org
Historically, we are taught to use task-dependent architecture design and objectives to
tackle data science tasks. Counterintuitively, this dogma has been proven (partly) wrong by …
On the study of sample complexity for polynomial neural networks
C Pan, C Zhang - arXiv preprint arXiv:2207.08896, 2022 - arxiv.org
As a general type of machine learning approach, artificial neural networks have established
state-of-the-art benchmarks in many pattern recognition and data analysis tasks. Among various …
[PDF] 1-Path-Norm Regularization of Deep Neural Networks
The so-called path-norm measure is considered one of the best indicators for good
generalization of neural networks. This paper introduces a proximal gradient framework for …