Yulin Chen
Verified email at mails.tsinghua.edu.cn
Title
Cited by
Year
Parameter-efficient fine-tuning of large-scale pre-trained language models
N Ding, Y Qin, G Yang, F Wei, Z Yang, Y Su, S Hu, Y Chen, CM Chan, ...
Nature Machine Intelligence 5 (3), 220-235, 2023
688 · 2023
Enhancing chat language models by scaling high-quality instructional conversations
N Ding, Y Chen, B Xu, Y Qin, Z Zheng, S Hu, Z Liu, M Sun, B Zhou
arXiv preprint arXiv:2305.14233, 2023
349 · 2023
OpenPrompt: An open-source framework for prompt-learning
N Ding, S Hu, W Zhao, Y Chen, Z Liu, HT Zheng, M Sun
arXiv preprint arXiv:2111.01998, 2021
314 · 2021
Few-NERD: A few-shot named entity recognition dataset
N Ding, G Xu, Y Chen, X Wang, X Han, P Xie, HT Zheng, Z Liu
arXiv preprint arXiv:2105.07464, 2021
242 · 2021
Delta tuning: A comprehensive study of parameter efficient methods for pre-trained language models
N Ding, Y Qin, G Yang, F Wei, Z Yang, Y Su, S Hu, Y Chen, CM Chan, ...
arXiv preprint arXiv:2203.06904, 2022
233 · 2022
Prompt-learning for fine-grained entity typing
N Ding, Y Chen, X Han, G Xu, P Xie, HT Zheng, Z Liu, J Li, HG Kim
arXiv preprint arXiv:2108.10604, 2021
169 · 2021
Sparse low-rank adaptation of pre-trained language models
N Ding, X Lv, Q Wang, Y Chen, B Zhou, Z Liu, M Sun
arXiv preprint arXiv:2311.11696, 2023
71 · 2023
MAVEN-ERE: A unified large-scale dataset for event coreference, temporal, causal, and subevent relation extraction
X Wang, Y Chen, N Ding, H Peng, Z Wang, Y Lin, X Han, L Hou, J Li, Z Liu, ...
arXiv preprint arXiv:2211.07342, 2022
51 · 2022
Empowering private tutoring by chaining large language models
Y Chen, N Ding, HT Zheng, Z Liu, M Sun, B Zhou
Proceedings of the 33rd ACM International Conference on Information and …, 2024
20 · 2024
Exploring lottery prompts for pre-trained language models
Y Chen, N Ding, X Wang, S Hu, HT Zheng, Z Liu, P Xie
arXiv preprint arXiv:2305.19500, 2023
10 · 2023
Few-shot classification with hypersphere modeling of prototypes
N Ding, Y Chen, G Cui, X Wang, HT Zheng, Z Liu, P Xie
arXiv preprint arXiv:2211.05319, 2022
9 · 2022
Articles 1–11