AI alignment: A comprehensive survey
AI alignment aims to make AI systems behave in line with human intentions and values. As
AI systems grow more capable, so do risks from misalignment. To provide a comprehensive …
Deep model fusion: A survey
Deep model fusion/merging is an emerging technique that merges the parameters or
predictions of multiple deep learning models into a single one. It combines the abilities of …
Repair: Renormalizing permuted activations for interpolation repair
In this paper we look into the conjecture of Entezari et al. (2021), which states that if the
permutation invariance of neural networks is taken into account, then there is likely no loss …
Mechanistic mode connectivity
We study neural network loss landscapes through the lens of mode connectivity, the
observation that minimizers of neural networks retrieved via training on a dataset are …
Class incremental learning with multi-teacher distillation
Distillation strategies are currently the primary approaches for mitigating forgetting in class
incremental learning (CIL). Existing methods generally inherit previous knowledge from a …
Proving linear mode connectivity of neural networks via optimal transport
The energy landscape of high-dimensional non-convex optimization problems is crucial to
understanding the effectiveness of modern deep neural network architectures. Recent works …
The empirical impact of neural parameter symmetries, or lack thereof
Many algorithms and observed phenomena in deep learning appear to be affected by
parameter symmetries: transformations of neural network parameters that do not change the …
Topological obstruction to the training of shallow ReLU neural networks
Studying the interplay between the geometry of the loss landscape and the optimization
trajectories of simple neural networks is a fundamental step for understanding their behavior …
Learning through atypical phase transitions in overparameterized neural networks
Current deep neural networks are highly overparameterized (up to billions of connection
weights) and nonlinear. Yet they can fit data almost perfectly through variants of gradient …
Symmetries, flat minima, and the conserved quantities of gradient flow
Empirical studies of the loss landscape of deep networks have revealed that many local
minima are connected through low-loss valleys. Yet, little is known about the theoretical …