Robust training under label noise by over-parameterization
Recently, over-parameterized deep networks, with increasingly more network parameters
than training samples, have dominated the performance of modern machine learning …
Exploring lottery ticket hypothesis in spiking neural networks
Abstract Spiking Neural Networks (SNNs) have recently emerged as a new generation of
low-power deep neural networks, which are suitable for implementation on low-power …
Ten lessons we have learned in the new "sparseland": A short handbook for sparse neural network researchers
This article does not propose any novel algorithm or new hardware for sparsity. Instead, it
aims to serve the "common good" for the increasingly prosperous Sparse Neural Network …
Visual prompting upgrades neural network sparsification: A data-model perspective
The rapid development of large-scale deep learning models questions the affordability of
hardware platforms, which necessitates pruning to reduce their computational and …