Publicly available articles: Ning Ding
Available somewhere: 16
Parameter-efficient Fine-tuning of Large-scale Pre-trained Language Models
N Ding, Y Qin, G Yang, F Wei, Z Yang, Y Su, S Hu, Y Chen, CM Chan, ...
Nature Machine Intelligence, 2023
Mandates: National Natural Science Foundation of China
PTR: Prompt Tuning with Rules for Text Classification
X Han, W Zhao, N Ding, Z Liu, M Sun
Preprint, 2022
Mandates: National Natural Science Foundation of China
UltraFeedback: Boosting Language Models with Scaled AI Feedback
G Cui, L Yuan, N Ding, G Yao, W Zhu, Y Ni, G Xie, Z Liu, M Sun
ICML 2024, 2023
Mandates: National Natural Science Foundation of China
Tool Learning with Foundation Models
Y Qin, S Hu, Y Lin, W Chen, N Ding, G Cui, Z Zeng, Y Huang, C Xiao, ...
ACM Computing Surveys, 2023
Mandates: National Natural Science Foundation of China
Hierarchy-aware Global Model for Hierarchical Text Classification
J Zhou, C Ma, D Long, G Xu, N Ding, H Zhang, P Xie, G Liu
ACL 2020, 2020
Mandates: National Natural Science Foundation of China
Chinese Relation Extraction with Multi-Grained Information and External Linguistic Knowledge
Z Li*, N Ding*, Z Liu, H Zheng, Y Shen
ACL 2019, 2019
Mandates: National Natural Science Foundation of China
Event Detection with Trigger-Aware Lattice Neural Network
N Ding, Z Li, Z Liu, H Zheng, Z Lin
EMNLP 2019, 2019
Mandates: National Natural Science Foundation of China
Sparse Structure Search for Parameter-Efficient Tuning
S Hu, Z Zhang, N Ding, Y Wang, Y Wang, Z Liu, M Sun
NeurIPS 2022, 2022
Mandates: National Natural Science Foundation of China
Empowering Private Tutoring by Chaining Large Language Models
Y Chen*, N Ding*, HT Zheng, Z Liu, M Sun, B Zhou
CIKM 2024, 2023
Mandates: National Natural Science Foundation of China
Infobox-to-text Generation with Tree-like Planning based Attention Network
Y Bai, Z Li, N Ding, Y Shen, HT Zheng
IJCAI 2020, 2020
Mandates: National Natural Science Foundation of China
INTERVENOR: Prompt the Coding Ability of Large Language Models with the Interactive Chain of Repairing
H Wang, Z Liu, S Wang, G Cui, N Ding, Z Liu, G Yu
ACL 2024 Findings, 2023
Mandates: National Natural Science Foundation of China
Unlock Predictable Scaling from Emergent Abilities
S Hu, X Liu, X Han, X Zhang, C He, W Zhao, Y Lin, N Ding, Z Ou, G Zeng, ...
ICLR 2024, 2023
Mandates: National Natural Science Foundation of China
Parameter-efficient Weight Ensembling Facilitates Task-level Knowledge Transfer
X Lv, N Ding, Y Qin, Z Liu, M Sun
ACL 2023, 2023
Mandates: National Natural Science Foundation of China
Integrating Linguistic Knowledge to Sentence Paraphrase Generation
Z Lin, Z Li, N Ding, HT Zheng, Y Shen, W Wang, CZ Zhao
AAAI 2020, 2020
Mandates: National Natural Science Foundation of China
Generalized Local Aggregation for Large Scale Gaussian Process Regression
Y Gao, N Li, N Ding, Y Li, T Dai, ST Xia
IJCNN 2020, 2020
Mandates: National Natural Science Foundation of China
Different Tunes Played with Equal Skill: Exploring a Unified Optimization Subspace for Delta Tuning
J Yi, W Chen, Y Qin, Y Lin, N Ding, X Han, Z Liu, M Sun, J Zhou
EMNLP 2022 Findings, 2022
Mandates: National Natural Science Foundation of China
Publication and funding information is determined automatically by a computer program.