A Subramoney, KK Nazeer, M Schöne, C Mayr, D Kappel. "Efficient recurrent architectures through activity sparsity and sparse back-propagation through time." arXiv preprint arXiv:2206.06178, 2022. Cited by 21.
HA Gonzalez, J Huang, F Kelber, KK Nazeer, T Langer, C Liu, et al. "SpiNNaker2: A large-scale neuromorphic system for event-based and asynchronous machine learning." arXiv preprint arXiv:2401.04491, 2024. Cited by 13.
KK Nazeer, M Schöne, R Mukherji, B Vogginger, C Mayr, D Kappel, et al. "Language Modeling on a SpiNNaker2 Neuromorphic Chip." 2024 IEEE 6th International Conference on AI Circuits and Systems (AICAS …), 2024. Cited by 4.
D Kappel, KK Nazeer, CT Fokam, C Mayr, A Subramoney. "Block-local learning with probabilistic latent representations." arXiv preprint arXiv:2305.14974, 2023. Cited by 4.
R Mukherji, M Schöne, KK Nazeer, C Mayr, A Subramoney. "Activity sparsity complements weight sparsity for efficient RNN inference." arXiv preprint arXiv:2311.07625, 2023. Cited by 2.
A Subramoney, KK Nazeer, M Schöne, C Mayr, D Kappel. "EGRU: Event-based GRU for activity-sparse inference and learning." 2022. Cited by 2.
M Schöne, Y Bhisikar, K Bania, KK Nazeer, C Mayr, A Subramoney, et al. "STREAM: A Universal State-Space Model for Sparse Geometric Data." arXiv preprint arXiv:2411.12603, 2024.
CT Fokam, KK Nazeer, L König, D Kappel, A Subramoney. "Asynchronous Stochastic Gradient Descent with Decoupled Backpropagation and Layer-Wise Updates." arXiv preprint arXiv:2410.05985, 2024.
R Mukherji, M Schöne, KK Nazeer, C Mayr, D Kappel, A Subramoney. "Weight Sparsity Complements Activity Sparsity in Neuromorphic Language Models." arXiv preprint arXiv:2405.00433, 2024.
D Kappel, KK Nazeer, CT Fokam, C Mayr, A Subramoney. "A variational framework for local learning with probabilistic latent representations." 5th Workshop on Practical ML for Limited/Low Resource Settings.
M Schöne, KK Nazeer, C Mayr, D Kappel, A Subramoney. "An efficient RNN Language Model using activity sparsity and sparse back-propagation through time."