LDD: High-Precision Training of Deep Spiking Neural Network Transformers Guided by an Artificial Neural Network

Y Liu, C Zhao, Y Jiang, Y Fang, F Chen - Biomimetics, 2024 - pmc.ncbi.nlm.nih.gov
The rise of large-scale Transformers has led to challenges regarding computational costs
and energy consumption. In this context, spiking neural networks (SNNs) offer potential …

Efficient Distillation of Deep Spiking Neural Networks for Full-Range Timestep Deployment

C Yu, X Zhao, L Liu, S Yang, G Wang, E Li… - arXiv preprint arXiv…, 2025 - arxiv.org
Spiking Neural Networks (SNNs) are emerging as a brain-inspired alternative to traditional
Artificial Neural Networks (ANNs), prized for their potential energy efficiency on …

Twin Network Augmentation: A Novel Training Strategy for Improved Spiking Neural Networks and Efficient Weight Quantization

L Deckers, B Vandersmissen, IJ Tsang… - arXiv preprint arXiv…, 2024 - arxiv.org
The proliferation of Artificial Neural Networks (ANNs) has led to increased energy
consumption, raising concerns about their sustainability. Spiking Neural Networks (SNNs) …

An Event-based Feature Representation Method for Event Stream Classification using Deep Spiking Neural Networks

L Liang, R Jiang, H Tang, R Yan - 2024 International Joint …, 2024 - ieeexplore.ieee.org
Event streams output by event cameras have low data redundancy and retain accurate
temporal information in the form of Address Event Representation (AER), which differs …
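For context, AER conventionally encodes each event as an (x, y, timestamp, polarity) tuple. Below is a minimal sketch that accumulates such a stream into a two-channel event-count frame; this is generic event-camera preprocessing under that assumed tuple layout, not the feature representation method proposed in this paper.

```python
import numpy as np

def events_to_frame(events, height, width):
    """Accumulate AER events into a 2-channel count frame (one per polarity).

    `events` is assumed to be an (N, 4) array of (x, y, t, p) rows: pixel
    address, timestamp, and polarity (0 = OFF, 1 = ON). Timestamps are
    ignored here; only spatial counts per polarity are kept.
    """
    frame = np.zeros((2, height, width), dtype=np.float32)
    for x, y, t, p in events:
        # Increment the count at this pixel in the matching polarity channel.
        frame[int(p), int(y), int(x)] += 1.0
    return frame

# Illustrative usage with synthetic events for a 128x128 sensor.
rng = np.random.default_rng(0)
n = 1000
events = np.stack([
    rng.integers(0, 128, n),   # x address
    rng.integers(0, 128, n),   # y address
    np.sort(rng.random(n)),    # timestamps (unused by this sketch)
    rng.integers(0, 2, n),     # polarity
], axis=1)
print(events_to_frame(events, 128, 128).shape)  # (2, 128, 128)
```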

Boosting Knowledge Distillation Via Local Categories Similarity Scaling

D Chen, X Shen, X Teng, L Lan - Available at SSRN 5022526 - papers.ssrn.com
Knowledge distillation (KD) is a technique for transferring knowledge from a pre-
trained deep teacher network to a shallower student model. Most existing KD methods alter …
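For context, the baseline these methods modify is Hinton-style softened-logit distillation. The sketch below shows that standard KD loss, not the local-category similarity scaling proposed in this paper; the temperature and weighting values are illustrative assumptions.

```python
import torch
import torch.nn.functional as F

def kd_loss(student_logits, teacher_logits, targets, T=4.0, alpha=0.7):
    """Classic softened-logit distillation loss (Hinton et al., 2015).

    Blends a KL-divergence term between temperature-scaled teacher and
    student distributions with the usual cross-entropy on hard labels.
    T and alpha are illustrative defaults, not values from this paper.
    """
    # Soften both distributions with temperature T.
    soft_student = F.log_softmax(student_logits / T, dim=1)
    soft_teacher = F.softmax(teacher_logits / T, dim=1)
    # T^2 rescales the soft-target gradients to match the hard-label term.
    distill = F.kl_div(soft_student, soft_teacher,
                       reduction="batchmean") * T * T
    hard = F.cross_entropy(student_logits, targets)
    return alpha * distill + (1 - alpha) * hard

# Illustrative usage: batch of 8 examples over 10 classes.
student = torch.randn(8, 10, requires_grad=True)
teacher = torch.randn(8, 10)
labels = torch.randint(0, 10, (8,))
print(kd_loss(student, teacher, labels).item())
```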