Yue Wang
Creator of CodeT5, co-creator of CodeRL and Aria
Verified email at salesforce.com - Homepage
Title
Cited by
Year
CodeT5: Identifier-aware unified pre-trained encoder-decoder models for code understanding and generation
Y Wang, W Wang, S Joty, SCH Hoi
EMNLP 2021, 2021
Cited by 1575 · 2021
CodeT5+: Open code large language models for code understanding and generation
Y Wang*, H Le*, AD Gotmare, NDQ Bui, J Li, SCH Hoi
EMNLP 2023, 1069-1088, 2023
Cited by 442 · 2023
Yi: Open foundation models by 01.AI
A Young, B Chen, C Li, C Huang, G Zhang, G Zhang, H Li, J Zhu, J Chen, ...
arXiv preprint arXiv:2403.04652, 2024
Cited by 392 · 2024
CodeRL: Mastering code generation through pretrained models and deep reinforcement learning
H Le*, Y Wang*, AD Gotmare, S Savarese, SCH Hoi
NeurIPS 2022, 21314-21328, 2022
Cited by 337 · 2022
Code completion with neural attention and pointer networks
J Li, Y Wang, MR Lyu, I King
IJCAI 2018, 2017
Cited by 286 · 2017
Topic-aware neural keyphrase generation for social media language
Y Wang, J Li, HP Chan, I King, MR Lyu, S Shi
ACL 2019, 2019
Cited by 104 · 2019
VD-BERT: A unified vision and dialog transformer with BERT
Y Wang, S Joty, MR Lyu, I King, C Xiong, SCH Hoi
EMNLP 2020, 2020
Cited by 103 · 2020
RAP-Gen: Retrieval-augmented patch generation with CodeT5 for automatic program repair
W Wang*, Y Wang*, S Joty, SCH Hoi
FSE 2023, 146-158, 2023
Cited by 59 · 2023
Microblog Hashtag Generation via Encoding Conversation Contexts
Yue Wang, Jing Li, Irwin King, Michael R. Lyu, Shuming Shi
NAACL-HLT 2019, 1624–1633, 2019
Cited by 37* · 2019
CodeT5: Identifier-aware unified pre-trained encoder-decoder models for code understanding and generation. arXiv 2021
Y Wang, W Wang, S Joty, SC Hoi
arXiv preprint arXiv:2109.00859, 2021
Cited by 30 · 2021
CodeTF: One-stop transformer library for state-of-the-art code LLM
NDQ Bui, H Le, Y Wang, J Li, AD Gotmare, SCH Hoi
arXiv preprint arXiv:2306.00029, 2023
Cited by 24 · 2023
Unified vision and dialogue transformer with BERT
Y Wang, CH Hoi, SR Joty
US Patent 11,562,147, 2023
Cited by 24 · 2023
Detect-Localize-Repair: A Unified Framework for Learning to Debug with CodeT5
NDQ Bui, Y Wang, S Hoi
EMNLP 2022 Findings, 2022
Cited by 24 · 2022
Machine translation verbosity control for automatic dubbing
SM Lakew, M Federico, Y Wang, C Hoang, Y Virkar, R Barra-Chicote, ...
ICASSP 2021-2021 IEEE International Conference on Acoustics, Speech and …, 2021
Cited by 23 · 2021
Aria: An open multimodal native mixture-of-experts model
D Li, Y Liu, H Wu, Y Wang, Z Shen, B Qu, X Niu, G Wang, B Chen, J Li
arXiv preprint arXiv:2410.05993, 2024
Cited by 21 · 2024
Cross-Media Keyphrase Prediction: A Unified Framework with Multi-Modality Multi-Head Attention and Image Wordings
Y Wang, J Li, MR Lyu, I King
EMNLP 2020, 2020
Cited by 19 · 2020
Towards Modeling the Style of Translators in Neural Machine Translation
Y Wang, C Hoang, M Federico
NAACL 2021, 2021
Cited by 14 · 2021
Yi: Open Foundation Models by 01.AI
01.AI, B Chen, C Li, C Huang, G Zhang, G Zhang, H Li, J Zhu, J Chen, ...
arXiv preprint arXiv:2403.04652, 2024
Cited by 12* · 2024
CodeEditorBench: Evaluating code editing capability of large language models
J Guo, Z Li, X Liu, K Ma, T Zheng, Z Yu, D Pan, Y Li, R Liu, Y Wang, S Guo, ...
arXiv preprint arXiv:2404.03543, 2024
Cited by 9 · 2024
AutoKaggle: A multi-agent framework for autonomous data science competitions
Z Li, Q Zang, D Ma, J Guo, T Zheng, M Liu, X Niu, Y Wang, J Yang, J Liu, ...
arXiv preprint arXiv:2410.20424, 2024
Cited by 7 · 2024
Articles 1–20