Yuanchun Wang
Other names: WANG Yuanchun (王元淳)
Verified email at ruc.edu.cn - Homepage
Title · Cited by · Year
GKD: A General Knowledge Distillation Framework for Large-scale Pre-trained Language Model
S Tan, WL Tam, Y Wang, W Gong, Y Yang, H Tang, K He, J Liu, J Wang, ...
The 61st Annual Meeting Of The Association For Computational Linguistics …, 2023
13 · 2023
R-Eval: A Unified Toolkit for Evaluating Domain Knowledge of Retrieval Augmented Large Language Models
S Tu, Y Wang, J Yu, Y Xie, Y Shi, X Wang, J Zhang, L Hou, J Li
7 · 2024
From MOOC to MAIC: Reshaping Online Teaching and Learning through LLM-driven Agents
J Yu, Z Zhang, D Zhang-li, S Tu, Z Hao, RM Li, H Li, Y Wang, H Li, L Gong, ...
arXiv preprint arXiv:2409.03512, 2024
3 · 2024
A Solution-based LLM API-using Methodology for Academic Information Seeking
Y Wang, J Yu, Z Yao, J Zhang, Y Xie, S Tu, Y Fu, Y Feng, J Zhang, ...
arXiv preprint arXiv:2405.15165, 2024
2 · 2024
NHGMI: Heterogeneous graph multi-view infomax with node-wise contrasting samples selection
Q Li, H Ni, Y Wang
Knowledge-Based Systems 289, 111520, 2024
2 · 2024
Are Intermediate Layers and Labels Really Necessary? A General Language Model Distillation Method
S Tan, WL Tam, Y Wang, W Gong, S Zhao, P Zhang, J Tang
arXiv preprint arXiv:2306.06625, 2023
2 · 2023
[Industry] GKD: A General Knowledge Distillation Framework for Large-scale Pre-trained Language Model
S Tan, WL Tam, Y Wang, W Gong, S Zhao, P Zhang, J Tang
The 61st Annual Meeting Of The Association For Computational Linguistics, 2023
1 · 2023
Authorship style transfer with inverse transfer data augmentation
Z Shao, J Zhang, H Li, X Huang, C Zhou, Y Wang, J Gong, C Li, H Chen
AI Open 5, 94-103, 2024
· 2024