Adversarial robustness of neural networks from the perspective of Lipschitz calculus: A survey
We survey the adversarial robustness of neural networks from the perspective of Lipschitz
calculus in a unifying fashion by expressing models, attacks and safety guarantees—that is …
Chordal and factor-width decompositions for scalable semidefinite and polynomial optimization
Chordal and factor-width decomposition methods for semidefinite programming and
polynomial optimization have recently enabled the analysis and control of large-scale linear …
When deep learning meets polyhedral theory: A survey
In the past decade, deep learning became the prevalent methodology for predictive
modeling thanks to the remarkable accuracy of deep neural networks in tasks such as …
CS-TSSOS: Correlative and term sparsity for large-scale polynomial optimization
This work proposes a new moment-SOS hierarchy, called CS-TSSOS, for solving large-
scale sparse polynomial optimization problems. Its novelty is to exploit simultaneously …
Chordal-TSSOS: a moment-SOS hierarchy that exploits term sparsity with chordal extension
This work is a follow-up and a complement to [J. Wang, V. Magron and J. B. Lasserre, preprint,
arXiv:1912.08899, 2019], where the TSSOS hierarchy was proposed for solving polynomial …
Certified robustness via dynamic margin maximization and improved Lipschitz regularization
To improve the robustness of deep classifiers against adversarial perturbations, many
approaches have been proposed, such as designing new architectures with better …
Hybrid ISTA: Unfolding ISTA with convergence guarantees using free-form deep neural networks
It is promising to solve linear inverse problems by unfolding iterative algorithms (e.g., the iterative
shrinkage thresholding algorithm (ISTA)) as deep neural networks (DNNs) with learnable …
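For context on the entry above: classical ISTA (the algorithm that unfolding-based networks such as this one build on) solves the lasso problem min_x ½‖Ax − b‖² + λ‖x‖₁ by alternating a gradient step with soft-thresholding. A minimal sketch of the standard, non-unfolded iteration (illustrative only; not the paper's hybrid architecture):

```python
import numpy as np

def soft_threshold(x, t):
    """Proximal operator of t * ||.||_1 (elementwise shrinkage)."""
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def ista(A, b, lam, n_iter=200):
    """Classical ISTA for min_x 0.5*||Ax - b||^2 + lam*||x||_1."""
    L = np.linalg.norm(A.T @ A, 2)   # Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        grad = A.T @ (A @ x - b)      # gradient of the smooth part
        x = soft_threshold(x - grad / L, lam / L)
    return x
```

Unfolded variants ("learned ISTA" and its descendants) replace the fixed matrices and thresholds in each iteration with learnable per-layer parameters.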
On the scalability and memory efficiency of semidefinite programs for Lipschitz constant estimation of neural networks
Lipschitz constant estimation plays an important role in understanding generalization,
robustness, and fairness in deep learning. Unlike naive bounds based on the network …
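The "naive bounds" the snippet above contrasts against are, for a feedforward network with 1-Lipschitz activations (e.g., ReLU), the product of the layers' spectral norms. A minimal sketch of that baseline bound (illustrative only; the paper's contribution is tighter SDP-based estimation, not this):

```python
import numpy as np

def naive_lipschitz_bound(weights):
    """Upper-bound the Lipschitz constant of a feedforward net with
    1-Lipschitz activations by the product of layer spectral norms."""
    bound = 1.0
    for W in weights:
        bound *= np.linalg.norm(W, 2)  # largest singular value of W
    return bound
```

This bound is cheap but typically very loose, which motivates the semidefinite-programming approaches surveyed in the entry.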
EMG-based automatic gesture recognition using Lipschitz-regularized neural networks
This article introduces a novel approach for building a robust Automatic Gesture Recognition
system based on Surface Electromyographic (sEMG) signals, acquired at the forearm level …
Sparse noncommutative polynomial optimization
This article focuses on optimization of polynomials in noncommuting variables, while taking
into account sparsity in the input data. A converging hierarchy of semidefinite relaxations for …