Hongyi Li
Verified email at mails.tsinghua.edu.cn
Title
Cited by
Year
Brain-inspired multimodal hybrid neural network for robot place recognition
F Yu, Y Wu, S Ma, M Xu, H Li, H Qu, C Song, T Wang, R Zhao, L Shi
Science Robotics 8 (78), eabm6996, 2023
46 · 2023
Neuromorphic computing chip with spatiotemporal elasticity for multi-intelligent-tasking robots
S Ma, J Pei, W Zhang, G Wang, D Feng, F Yu, C Song, H Qu, C Ma, M Lu, ...
Science Robotics 7 (67), eabk2948, 2022
41 · 2022
HASP: Hierarchical asynchronous parallelism for multi-NN tasks
H Li, S Ma, T Wang, W Zhang, G Wang, C Song, H Qu, J Lin, C Ma, J Pei, ...
IEEE Transactions on Computers, 2023
6 · 2023
Adaptive Synaptic Scaling in Spiking Networks for Continual Learning and Enhanced Robustness
M Xu, F Liu, Y Hu, H Li, Y Wei, S Zhong, J Pei, L Deng
IEEE Transactions on Neural Networks and Learning Systems, 2024
4 · 2024
SongC: A compiler for hybrid near-memory and in-memory many-core architecture
J Lin, H Qu, S Ma, X Ji, H Li, X Li, C Song, W Zhang
IEEE Transactions on Computers, 2023
3 · 2023
Efficient GCN Deployment with Spiking Property on Spatial-Temporal Neuromorphic Chips
H Li, M Xu, J Pei, R Zhao
Proceedings of the 2023 International Conference on Neuromorphic Systems, 1-8, 2023
1 · 2023
RoboSpike: Fully Utilizing the Heterogeneous System with Subcallback Scheduling in ROS 2
H Li, Q Yang, S Ma, R Zhao, X Ji
IEEE Transactions on Computer-Aided Design of Integrated Circuits and Systems, 2025
2025
Hybrid neural networks for continual learning inspired by corticohippocampal circuits
Q Shi, F Liu, H Li, G Li, L Shi, R Zhao
Nature Communications 16 (1), 1272, 2025
2025
General-purpose Dataflow Model with Neuromorphic Primitives
W Zhang, Y Du, H Li, S Ma, R Zhao
arXiv preprint arXiv:2408.01090, 2024
2024
HASP: Hierarchical Asynchronous Parallelism for Multi-NN Tasks
H Li, S Ma, T Wang, W Zhang, G Wang, C Song, H Qu, J Lin, C Ma, J Pei, ...
IEEE Transactions on Computers 73 (2), 366-379, 2024
2024
Articles 1–10