Piecewise linear neural networks and deep learning
As a powerful modelling method, piecewise linear neural networks (PWLNNs) have proven
successful in various fields, most recently in deep learning. To apply PWLNN methods, both …
When deep learning meets polyhedral theory: A survey
In the past decade, deep learning became the prevalent methodology for predictive
modeling thanks to the remarkable accuracy of deep neural networks in tasks such as …
Synchronization-enhanced deep learning early flood risk predictions: The core of data-driven city digital twins for climate resilience planning
Floods have been among the costliest hydrometeorological hazards across the globe for
decades, and are expected to become even more frequent and cause larger devastating …
Unraveling attention via convex duality: Analysis and interpretations of vision transformers
Vision transformers using self-attention or its proposed alternatives have demonstrated
promising results in many image related tasks. However, the underpinning inductive bias of …
Fast convex optimization for two-layer ReLU networks: Equivalent model classes and cone decompositions
We develop fast algorithms and robust software for convex optimization of two-layer neural
networks with ReLU activation functions. Our work leverages a convex re-formulation of the …
Vector-output ReLU neural network problems are copositive programs: Convex analysis of two-layer networks and polynomial-time algorithms
We describe the convex semi-infinite dual of the two-layer vector-output ReLU neural
network training problem. This semi-infinite dual admits a finite dimensional representation …
Optimal sets and solution paths of ReLU networks
We develop an analytical framework to characterize the set of optimal ReLU neural networks
by reformulating the non-convex training problem as a convex program. We show that the …
Efficient global optimization of two-layer ReLU networks: Quadratic-time algorithms and adversarial training
The nonconvexity of the artificial neural network (ANN) training landscape brings
optimization difficulties. While the traditional back-propagation stochastic gradient descent …
Demystifying batch normalization in ReLU networks: Equivalent convex optimization models and implicit regularization
Batch Normalization (BN) is a commonly used technique to accelerate and stabilize training
of deep neural networks. Despite its empirical success, a full theoretical understanding of …
Convex relaxations of ReLU neural networks approximate global optima in polynomial time
In this paper, we study the optimality gap between two-layer ReLU networks regularized with
weight decay and their convex relaxations. We show that when the training data is random …