Zhenghao Lin
Email address verified at stu.xmu.edu.cn
Title
Cited by
Year
AnnoLLM: Making large language models to be better crowdsourced annotators
X He, Z Lin, Y Gong, A Jin, H Zhang, C Lin, J Jiao, SM Yiu, N Duan, ...
arXiv preprint arXiv:2303.16854, 2023
175 · 2023
Text generation with diffusion language models: A pre-training approach with continuous paragraph denoise
Z Lin, Y Gong, Y Shen, T Wu, Z Fan, C Lin, N Duan, W Chen
International Conference on Machine Learning, 21051-21064, 2023
76* · 2023
Rho-1: Not all tokens are what you need
Z Lin, Z Gou, Y Gong, X Liu, Y Shen, R Xu, C Lin, Y Yang, J Jiao, N Duan, ...
arXiv preprint arXiv:2404.07965, 2024
52 · 2024
Prod: Progressive distillation for dense retrieval
Z Lin, Y Gong, X Liu, H Zhang, C Lin, A Dong, J Jiao, J Lu, D Jiang, ...
WWW 2023, 2022
29 · 2022
Sentiment-Aware Word and Sentence Level Pre-training for Sentiment Analysis
S Fan, C Lin, H Li, Z Lin, J Su, H Zhang, Y Gong, J Guo, N Duan
EMNLP 2022, 2022
25 · 2022
Competition-level problems are effective llm evaluators
Y Huang, Z Lin, X Liu, Y Gong, S Lu, F Lei, Y Liang, Y Shen, C Lin, ...
arXiv preprint arXiv:2312.02143, 2023
17 · 2023
Ensuring safe and high-quality outputs: A guideline library approach for language models
Y Luo, Z Lin, Y Zhang, J Sun, C Lin, C Xu, X Su, Y Shen, J Guo, Y Gong
arXiv preprint arXiv:2403.11838, 2024
2 · 2024
Revolutionizing Database Q&A with Large Language Models: Comprehensive Benchmark and Evaluation
Y Zheng, B Li, Z Lin, Y Luo, X Zhou, C Lin, J Su, G Li, S Li
arXiv preprint arXiv:2409.04475, 2024
1 · 2024
Sigma: Differential Rescaling of Query, Key and Value for Efficient Language Models
Z Lin, Z Tang, X Liu, Y Gong, Y Cheng, Q Chen, H Li, Y Xin, Z Yang, ...
arXiv preprint arXiv:2501.13629, 2025
2025
Articles 1–9