Hao Yu
Verified email at lamda.nju.edu.cn - Homepage
Title · Cited by · Year
A unified pruning framework for vision transformers
H Yu, J Wu
Science China Information Sciences 66 (7), 179101, 2023
63 · 2023
Training vision transformers with only 2040 images
YH Cao, H Yu, J Wu
European Conference on Computer Vision, 220-237, 2022
59 · 2022
Mixup without hesitation
H Yu, H Wang, J Wu
Image and Graphics: 11th International Conference, ICIG 2021, Haikou, China …, 2021
34 · 2021
Compressing transformers: features are low-rank, but weights are not!
H Yu, J Wu
Proceedings of the AAAI Conference on Artificial Intelligence 37 (9), 11007 …, 2023
33 · 2023
Fast k-means clustering with Anderson acceleration
J Zhang, Y Yao, Y Peng, H Yu, B Deng
arXiv preprint arXiv:1805.10638, 2018
10 · 2018
Effectively Compress KV Heads for LLM
H Yu, Z Yang, S Li, Y Li, J Wu
arXiv preprint arXiv:2406.07056, 2024
8 · 2024
Reviving undersampling for long-tailed learning
H Yu, Y Du, J Wu
Pattern Recognition 161, 111200, 2025
2 · 2025
Treasures in Discarded Weights for LLM Quantization
H Yu, Y Zhou, B Chen, Z Yang, S Li, Y Li, J Wu
2025
Quantization without Tears
M Fu, H Yu, J Shao, J Zhou, K Zhu, J Wu
arXiv preprint arXiv:2411.13918, 2024
2024
Unified Low-rank Compression Framework for Click-through Rate Prediction
H Yu, M Fu, J Ding, Y Zhou, J Wu
Proceedings of the 30th ACM SIGKDD Conference on Knowledge Discovery and …, 2024
2024