Towards High-performance Spiking Transformers from ANN to SNN Conversion

Z Huang, X Shi, Z Hao, T Bu, J Ding, Z Yu… - Proceedings of the 32nd …, 2024 - dl.acm.org
Spiking neural networks (SNNs) show great potential due to their energy efficiency, fast
processing capabilities, and robustness. There are two main approaches to constructing …
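The snippet is cut off, but rate-based ANN-to-SNN conversion, as in this paper's title, generally rests on one observation: over enough timesteps, the firing rate of an integrate-and-fire (IF) neuron with soft reset approximates a clipped ReLU of its input. A minimal sketch of that correspondence (function name and constants are illustrative, not taken from the paper):

```python
def if_neuron_rate(x: float, T: int = 100, v_th: float = 1.0) -> float:
    """Drive an integrate-and-fire neuron with constant input x for T
    timesteps and return its average firing rate."""
    v, spikes = 0.0, 0
    for _ in range(T):
        v += x                 # integrate the input current
        if v >= v_th:          # emit a spike when the threshold is crossed
            spikes += 1
            v -= v_th          # soft reset: subtract the threshold
    return spikes / T

# The rate tracks ReLU(x) clipped to [0, 1], which is why trained ANN
# activations can be mapped onto spike rates:
for x in (-0.5, 0.0, 0.25, 0.5, 1.0):
    print(f"x={x:+.2f}  rate={if_neuron_rate(x):.2f}")
```

Practical conversion pipelines additionally rescale weights or thresholds layer by layer so that activations stay within the representable rate range; how this paper adapts that recipe to Transformers is not visible from the truncated abstract.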

Hybrid neural networks for continual learning inspired by corticohippocampal circuits

Q Shi, F Liu, H Li, G Li, L Shi, R Zhao - Nature Communications, 2025 - nature.com
Current artificial systems suffer from catastrophic forgetting during continual learning, a
limitation absent in biological systems. Biological mechanisms leverage the dual …
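The abstract breaks off before describing the paper's corticohippocampal mechanism; as a point of reference only, the standard consolidation baseline in this literature is elastic weight consolidation (EWC, Kirkpatrick et al., 2017), which penalizes drift in parameters that mattered for earlier tasks. A sketch of that generic penalty, not the paper's hybrid method:

```python
import torch
import torch.nn as nn

def ewc_penalty(model: nn.Module,
                fisher: dict,      # per-parameter Fisher estimates from the old task
                old_params: dict,  # parameter values saved after the old task
                lam: float = 1000.0) -> torch.Tensor:
    """Quadratic penalty pulling parameters with large Fisher values
    back toward the values they held after the previous task."""
    loss = torch.zeros(())
    for name, p in model.named_parameters():
        loss = loss + (fisher[name] * (p - old_params[name]) ** 2).sum()
    return 0.5 * lam * loss
```

The penalty is added to the new task's loss during training, so important weights resist overwriting while unimportant ones remain free to adapt.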

Brain-inspired continual pre-trained learner via silent synaptic consolidation

X Ran, J Yao, Y Wang, M Xu, D Liu - arXiv preprint arXiv:2410.05899, 2024 - arxiv.org
Pre-trained models have demonstrated impressive generalization capabilities, yet they
remain vulnerable to catastrophic forgetting when incrementally trained on new tasks …

Enhancing Generalization and Convergence in Neural Networks through a Dual-Phase Regularization Approach with Excitatory-Inhibitory Transition

M Xu, H Yin, S Zhong - 2024 International Conference on …, 2024 - ieeexplore.ieee.org
Overfitting and slow convergence represent common hurdles encountered during the
training of deep neural networks (DNNs). While conventional regularization methods like …
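The sentence is truncated before any methods are named, but the "conventional regularization" baselines in this setting are typically weight decay (L2) and dropout, each a one-liner in modern frameworks. An illustrative PyTorch snippet showing those baselines, not the paper's dual-phase excitatory-inhibitory scheme:

```python
import torch
import torch.nn as nn

model = nn.Sequential(nn.Linear(64, 128), nn.ReLU(),
                      nn.Dropout(p=0.5),          # dropout regularization
                      nn.Linear(128, 10))
# L2 regularization applied as weight decay inside the optimizer:
optimizer = torch.optim.SGD(model.parameters(), lr=0.01, weight_decay=1e-4)
```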