Efficient spiking neural networks with sparse selective activation for continual learning
The next generation of machine intelligence requires the capability of continual learning to
acquire new knowledge without forgetting the old one while conserving limited computing …
Discrete key-value bottleneck
Deep neural networks perform well on classification tasks where data streams are iid and
labeled data is abundant. Challenges emerge with non-stationary training data streams …
Emergence of sparse representations from noise
A hallmark of biological neural networks, which distinguishes them from their artificial
counterparts, is the high degree of sparsity in their activations. This discrepancy raises three …
Continual neural computation
Continuously processing a stream of non-iid data with neural models, with the goal of
progressively learning new skills, is largely known to introduce significant challenges …
Elephant neural networks: Born to be a continual learner
Catastrophic forgetting has remained a significant challenge to continual learning for decades.
While recent works have proposed effective methods to mitigate this problem, they mainly …
Reducing catastrophic forgetting with associative learning: a lesson from fruit flies
Catastrophic forgetting remains an outstanding challenge in continual learning. Recently,
methods inspired by the brain, such as continual representation learning and memory …
Generating Prompts in Latent Space for Rehearsal-free Continual Learning
Continual learning emerges as a framework that trains the model on a sequence of tasks
without forgetting previously learned knowledge, which has been applied in multiple …
Key-value memory in the brain
Classical models of memory in psychology and neuroscience rely on similarity-based
retrieval of stored patterns, where similarity is a function of retrieval cues and the stored …
Comply: Learning Sentences with Complex Weights inspired by Fruit Fly Olfaction
A Figueroa, J Westerhoff, A Golzar, D Fast… - arXiv preprint arXiv…, 2025 - arxiv.org
Biologically inspired neural networks offer alternative avenues to model data distributions.
FlyVec is a recent example that draws inspiration from the fruit fly's olfactory circuit to tackle …
Associative memory under the probabilistic lens: Improved transformers & dynamic memory creation
Clustering is a fundamental unsupervised learning problem, and recent work showed
modern continuous associative memory (AM) networks can learn to cluster data via a novel …