LDD: High-Precision Training of Deep Spiking Neural Network Transformers Guided by an Artificial Neural Network
The rise of large-scale Transformers has led to challenges regarding computational costs
and energy consumption. In this context, spiking neural networks (SNNs) offer potential …
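For context on the spiking models that this entry and the ones below build on: deep SNN work, including SNN Transformers, typically assumes a leaky integrate-and-fire (LIF) neuron as the basic unit. The following is a generic textbook sketch of LIF dynamics, not the LDD paper's architecture; the decay factor and threshold values are illustrative.

```python
import torch

def lif_step(v, x, beta=0.9, v_th=1.0):
    """One leaky integrate-and-fire step (generic sketch, not the LDD model).

    v: membrane potential carried over from the previous timestep
    x: input current at this timestep
    beta: leak/decay factor (illustrative value)
    v_th: firing threshold (illustrative value)
    """
    v = beta * v + x                 # leaky integration of input current
    spike = (v >= v_th).float()      # emit a binary spike where threshold is crossed
    v = v - spike * v_th             # soft reset: subtract threshold after firing
    return spike, v

# Unrolling over T timesteps: an SNN trades one dense activation for
# T sparse binary spike maps, which is where the energy savings come from.
T, batch, features = 4, 2, 8
v = torch.zeros(batch, features)
inputs = torch.rand(T, batch, features)
spikes = []
for t in range(T):
    s, v = lif_step(v, inputs[t])
    spikes.append(s)
```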
Efficient Distillation of Deep Spiking Neural Networks for Full-Range Timestep Deployment
Spiking Neural Networks (SNNs) are emerging as a brain-inspired alternative to traditional
Artificial Neural Networks (ANNs), prized for their potential energy efficiency on …
Twin Network Augmentation: A Novel Training Strategy for Improved Spiking Neural Networks and Efficient Weight Quantization
The proliferation of Artificial Neural Networks (ANNs) has led to increased energy
consumption, raising concerns about their sustainability. Spiking Neural Networks (SNNs) …
An Event-based Feature Representation Method for Event Stream Classification using Deep Spiking Neural Networks
Event streams output by event cameras have low data redundancy and retain accurate
temporal information in the form of Address Event Representation (AER), which are different …
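Address Event Representation is a simple sparse format: each event is a tuple of pixel address, timestamp, and polarity. Below is a minimal sketch of how such a stream might be stored and binned into count frames for a network; the field names, dtypes, and the naive binning are illustrative assumptions, not the feature representation this paper proposes.

```python
import numpy as np

# Generic AER layout: each event is (x, y, timestamp_us, polarity).
# Field names and dtypes are illustrative, not the paper's encoding.
aer_dtype = np.dtype([("x", np.uint16), ("y", np.uint16),
                      ("t", np.uint32), ("p", np.int8)])

events = np.array([(12, 30, 1000, 1), (13, 30, 1040, -1), (12, 31, 2100, 1)],
                  dtype=aer_dtype)

def bin_events(events, height, width, t_window_us):
    """Accumulate events into per-window count frames. This is one common,
    lossy way to feed AER data to a network; it discards sub-window timing,
    which is exactly the information this paper's method aims to retain."""
    n_bins = int(events["t"].max() // t_window_us) + 1
    frames = np.zeros((n_bins, 2, height, width), dtype=np.float32)
    for e in events:
        b = int(e["t"] // t_window_us)
        c = 0 if e["p"] > 0 else 1       # separate ON/OFF polarity channels
        frames[b, c, e["y"], e["x"]] += 1.0
    return frames

frames = bin_events(events, height=64, width=64, t_window_us=1000)
```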
Boosting Knowledge Distillation Via Local Categories Similarity Scaling
D Chen, X Shen, X Teng, L Lan - Available at SSRN 5022526 - papers.ssrn.com
Knowledge distillation (KD) is a technique for transferring knowledge from a pre-
trained deep teacher network to a shallower student model. Most existing KD methods alter …
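For readers new to the setup this entry modifies: standard soft-target KD (Hinton et al.) blends a cross-entropy term on ground-truth labels with a temperature-scaled KL divergence between teacher and student logits. The sketch below is that baseline formulation, not the local-category similarity scaling proposed in the paper; `T` and `alpha` are illustrative hyperparameters.

```python
import torch
import torch.nn.functional as F

def kd_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.5):
    """Baseline soft-target distillation loss (Hinton et al.), shown for
    context; the SSRN paper above proposes a modification of this scheme.

    T: softmax temperature (illustrative)
    alpha: weight between hard-label CE and distillation terms (illustrative)
    """
    # Hard-label term: ordinary cross-entropy against ground-truth classes.
    ce = F.cross_entropy(student_logits, labels)
    # Soft-label term: KL divergence between temperature-softened teacher
    # and student distributions, rescaled by T^2 to keep gradients comparable.
    kl = F.kl_div(F.log_softmax(student_logits / T, dim=1),
                  F.softmax(teacher_logits / T, dim=1),
                  reduction="batchmean") * (T * T)
    return alpha * ce + (1.0 - alpha) * kl

# Usage: the teacher runs without gradients; only the student is trained.
student_logits = torch.randn(8, 10, requires_grad=True)
with torch.no_grad():
    teacher_logits = torch.randn(8, 10)
labels = torch.randint(0, 10, (8,))
loss = kd_loss(student_logits, teacher_logits, labels)
loss.backward()
```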