A unified approach to domain incremental learning with memory: Theory and algorithm
H Shi, H Wang - Advances in Neural Information Processing …, 2023 - proceedings.neurips.cc
Abstract Domain incremental learning aims to adapt to a sequence of domains with access
to only a small subset of data (i.e., memory) from previous domains. Various methods have …
Deep neural collapse is provably optimal for the deep unconstrained features model
P Súkeník, M Mondelli… - Advances in Neural …, 2023 - proceedings.neurips.cc
Neural collapse (NC) refers to the surprising structure of the last layer of deep neural
networks in the terminal phase of gradient descent training. Recently, an increasing amount …
Compressible dynamics in deep overparameterized low-rank learning & adaptation
While overparameterization in machine learning models offers great benefits in terms of
optimization and generalization, it also leads to increased computational requirements as …
Generalized neural collapse for a large number of classes
Neural collapse provides an elegant mathematical characterization of learned last layer
representations (aka features) and classifier weights in deep classification models. Such …
Neural collapse in deep linear networks: from balanced to imbalanced data
Modern deep neural networks have achieved impressive performance on tasks from image
classification to natural language processing. Surprisingly, these complex systems with …
Principled and efficient transfer learning of deep models via neural collapse
As model size continues to grow and access to labeled training data remains limited,
transfer learning has become a popular approach in many scientific and engineering fields …
YOLO-adaptor: a fast adaptive one-stage detector for non-aligned visible-infrared object detection
Visible-infrared object detection has attracted increasing attention recently due to its
superior performance and cost-efficiency. Most existing methods focus on the detection of …
Understanding deep representation learning via layerwise feature compression and discrimination
Over the past decade, deep learning has proven to be a highly effective tool for learning
meaningful features from raw data. However, it remains an open question how deep …
Navigate beyond shortcuts: Debiased learning through the lens of neural collapse
Recent studies have noted an intriguing phenomenon termed Neural Collapse, which occurs when
the neural networks establish the right correlation between feature spaces and the training …
The law of parsimony in gradient descent for learning deep linear networks
Over the past few years, an extensively studied phenomenon in training deep networks is
the implicit bias of gradient descent towards parsimonious solutions. In this work, we …