Mengzhao Chen (陈锰钊)
Other names: Mengzhao Chen
Verified email: connect.hku.hk - Homepage
Title · Cited by · Year
OmniQuant: Omnidirectionally calibrated quantization for large language models
W Shao*, M Chen*, Z Zhang, P Xu, L Zhao, Z Li, K Zhang, P Gao, Y Qiao, ...
ICLR 2024 spotlight (* equal contribution), 2023
Cited by 180 · 2023
CF-ViT: A general coarse-to-fine method for vision transformer
M Chen, M Lin, K Li, Y Shen, Y Wu, F Chao, R Ji
AAAI 2023 Oral, 2023
Cited by 72 · 2023
DiffRate: Differentiable compression rate for efficient vision transformers
M Chen, W Shao, P Xu, M Lin, K Zhang, F Chao, R Ji, Y Qiao, P Luo
ICCV 2023, 2023
Cited by 45 · 2023
Super vision transformer
M Lin*, M Chen*, Y Zhang, C Shen, R Ji, L Cao
IJCV 2023 (* equal contribution), 2023
Cited by 27 · 2023
EfficientQAT: Efficient quantization-aware training for large language models
M Chen, W Shao, P Xu, J Wang, P Gao, K Zhang, P Luo
arXiv preprint arXiv:2407.11062, 2024
Cited by 23 · 2024
Fine-grained data distribution alignment for post-training quantization
Y Zhong, M Lin, M Chen, K Li, Y Shen, F Chao, Y Wu, R Ji
ECCV 2022, 2022
Cited by 21 · 2022
SMMix: Self-motivated image mixing for vision transformers
M Chen, M Lin, Z Lin, Y Zhang, F Chao, R Ji
ICCV 2023, 2023
Cited by 12 · 2023
BESA: Pruning large language models with blockwise parameter-efficient sparsity allocation
P Xu, W Shao, M Chen, S Tang, K Zhang, P Gao, F An, Y Qiao, P Luo
ICLR 2024, 2024
Cited by 10 · 2024
I&S-ViT: An Inclusive & Stable Method for Pushing the Limit of Post-Training ViTs Quantization
Y Zhong, J Hu, M Lin, M Chen, R Ji
arXiv preprint arXiv:2311.10126, 2023
Cited by 7 · 2023
OptG: Optimizing Gradient-driven Criteria in Network Sparsity
Y Zhang, M Lin, M Chen, F Chao, R Ji
arXiv preprint arXiv:2201.12826, 2022
Cited by 4 · 2022
Adapting LLaMA decoder to vision transformer
J Wang, W Shao, M Chen, C Wu, Y Liu, T Wu, K Zhang, S Zhang, K Chen, ...
arXiv preprint arXiv:2404.06773, 2024
Cited by 3 · 2024
PrefixQuant: Static Quantization Beats Dynamic through Prefixed Outliers in LLMs
M Chen, Y Liu, J Wang, Y Bin, W Shao, P Luo
arXiv preprint arXiv:2410.05265, 2024
Cited by 2 · 2024