Inducing neural collapse in imbalanced learning: Do we really need a learnable classifier at the end of deep neural network?

Y Yang, S Chen, X Li, L Xie, Z Lin… - Advances in neural …, 2022 - proceedings.neurips.cc
Modern deep neural networks for classification usually jointly learn a backbone for
representation and a linear classifier to output the logit of each class. A recent study has …

Arcface: Additive angular margin loss for deep face recognition

J Deng, J Guo, N Xue… - Proceedings of the IEEE …, 2019 - openaccess.thecvf.com
One of the main challenges in feature learning using Deep Convolutional Neural Networks
(DCNNs) for large-scale face recognition is the design of appropriate loss functions that can …

Neural collapse with normalized features: A geometric analysis over the riemannian manifold

C Yaras, P Wang, Z Zhu… - Advances in neural …, 2022 - proceedings.neurips.cc
When training overparameterized deep networks for classification tasks, it has been widely
observed that the learned features exhibit a so-called "neural collapse" phenomenon. More …

Class-incremental learning with pre-allocated fixed classifiers

F Pernici, M Bruni, C Baecchi, F Turchini… - 2020 25th …, 2021 - ieeexplore.ieee.org
In class-incremental learning, a learning agent faces a stream of data with the goal of
learning new classes while not forgetting previous ones. Neural networks are known to …

Stationary representations: Optimally approximating compatibility and implications for improved model replacements

N Biondi, F Pernici, S Ricci… - Proceedings of the …, 2024 - openaccess.thecvf.com
Learning compatible representations enables the interchangeable use of semantic features
as models are updated over time. This is particularly relevant in search and retrieval systems …

Inducing neural collapse to a fixed hierarchy-aware frame for reducing mistake severity

T Liang, J Davis - … of the IEEE/CVF International Conference …, 2023 - openaccess.thecvf.com
There is a recently discovered and intriguing phenomenon called Neural Collapse: at the
terminal phase of training a deep neural network for classification, the within-class …

Regular polytope networks

F Pernici, M Bruni, C Baecchi… - IEEE Transactions on …, 2021 - ieeexplore.ieee.org
Neural networks are widely used as a model for classification in a large variety of tasks.
Typically, a learnable transformation (i.e., the classifier) is placed at the end of such models …

On modality bias recognition and reduction

Y Guo, L Nie, H Cheng, Z Cheng… - ACM Transactions on …, 2023 - dl.acm.org
Making each modality in multi-modal data contribute is of vital importance to learning a
versatile multi-modal model. Existing methods, however, are often dominated by one or few …

Cores: Compatible representations via stationarity

N Biondi, F Pernici, M Bruni… - IEEE Transactions on …, 2023 - ieeexplore.ieee.org
Compatible features enable the direct comparison of old and new learned features, allowing
them to be used interchangeably over time. In visual search systems, this eliminates the need to …

Fine-grained adversarial semi-supervised learning

D Mugnai, F Pernici, F Turchini… - ACM Transactions on …, 2022 - dl.acm.org
In this article, we exploit Semi-Supervised Learning (SSL) to increase the amount of training
data to improve the performance of Fine-Grained Visual Categorization (FGVC). This …