Neural collapse: A review on modelling principles and generalization
V Kothapalli - arXiv preprint arXiv:2206.04041, 2022 - arxiv.org
Deep classifier neural networks enter the terminal phase of training (TPT) when training
error reaches zero and tend to exhibit intriguing Neural Collapse (NC) properties. Neural …
A geometric analysis of neural collapse with unconstrained features
We provide the first global optimization landscape analysis of Neural Collapse--an intriguing
empirical phenomenon that arises in the last-layer classifiers and features of neural …
Inducing neural collapse in imbalanced learning: Do we really need a learnable classifier at the end of deep neural network?
Modern deep neural networks for classification usually jointly learn a backbone for
representation and a linear classifier to output the logit of each class. A recent study has …
On the optimization landscape of neural collapse under mse loss: Global optimality with unconstrained features
When training deep neural networks for classification tasks, an intriguing empirical
phenomenon has been widely observed in the last-layer classifiers and features, where (i) …
Understanding imbalanced semantic segmentation through neural collapse
A recent study has shown a phenomenon called neural collapse, in which the within-class
means of features and the classifier weight vectors converge to the vertices of a simplex …
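The simplex geometry named in the snippet above refers to what the neural-collapse literature calls a simplex equiangular tight frame (ETF): K class-mean vectors with equal norms and maximally separated, equal pairwise angles. A minimal numerical sketch of the standard construction (not code from any of the listed papers) is:

```python
import numpy as np

def simplex_etf(K: int) -> np.ndarray:
    """Columns are K vertices of a simplex equiangular tight frame in R^K."""
    return np.sqrt(K / (K - 1)) * (np.eye(K) - np.ones((K, K)) / K)

K = 4
M = simplex_etf(K)

# Every vertex has unit norm ...
norms = np.linalg.norm(M, axis=0)
# ... and every pair of vertices has cosine similarity -1/(K-1),
# the most negative value achievable by K equiangular unit vectors.
cosines = M.T @ M
```

Under neural collapse, both the within-class feature means and the classifier weight rows are observed to converge (up to rotation and scaling) to such a configuration.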
Imbalance trouble: Revisiting neural-collapse geometry
C Thrampoulidis, GR Kini… - Advances in Neural …, 2022 - proceedings.neurips.cc
Neural Collapse refers to the remarkable structural properties characterizing the geometry of
class embeddings and classifier weights, found by deep nets when trained beyond zero …
Feature learning in deep classifiers through intermediate neural collapse
A Rangamani, M Lindegaard… - … on machine learning, 2023 - proceedings.mlr.press
In this paper, we conduct an empirical study of the feature learning process in deep
classifiers. Recent research has identified a training phenomenon called Neural Collapse …
Extended unconstrained features model for exploring deep neural collapse
The modern strategy for training deep neural networks for classification tasks includes
optimizing the network's weights even after the training error vanishes to further push the …
Are all losses created equal: A neural collapse perspective
While cross entropy (CE) is the most commonly used loss function to train deep neural
networks for classification tasks, many alternative losses have been developed to obtain …
No fear of classifier biases: Neural collapse inspired federated learning with synthetic and fixed classifier
Data heterogeneity is an inherent challenge that hinders the performance of federated
learning (FL). Recent studies have identified the biased classifiers of local models as the key …