Yao Fu
Google DeepMind
Verified email at google.com - Homepage
Title
Cited by
Year
C-Eval: A multi-level multi-discipline Chinese evaluation suite for foundation models
Y Huang, Y Bai, Z Zhu, J Zhang, J Zhang, T Su, J Liu, C Lv, Y Zhang, Y Fu, ...
Advances in Neural Information Processing Systems 36, 2024
385* · 2024
Decomposed prompting: A modular approach for solving complex tasks
T Khot, H Trivedi, M Finlayson, Y Fu, K Richardson, P Clark, A Sabharwal
arXiv preprint arXiv:2210.02406, 2022
381 · 2022
Complexity-based prompting for multi-step reasoning
Y Fu, H Peng, A Sabharwal, P Clark, T Khot
The Eleventh International Conference on Learning Representations, 2022
350 · 2022
MAmmoTH: Building math generalist models through hybrid instruction tuning
X Yue, X Qu, G Zhang, Y Fu, W Huang, H Sun, Y Su, W Chen
arXiv preprint arXiv:2309.05653, 2023
265 · 2023
Specializing smaller language models towards multi-step reasoning
Y Fu, H Peng, L Ou, A Sabharwal, T Khot
International Conference on Machine Learning, 10421-10430, 2023
208 · 2023
Improving language model negotiation with self-play and in-context learning from AI feedback
Y Fu, H Peng, T Khot, M Lapata
arXiv preprint arXiv:2305.10142, 2023
131 · 2023
Paraphrase generation with latent bag of words
Y Fu, Y Feng, JP Cunningham
Advances in Neural Information Processing Systems 32, 2019
102 · 2019
Data engineering for scaling language models to 128K context
Y Fu, R Panda, X Niu, X Yue, H Hajishirzi, Y Kim, H Peng
arXiv preprint arXiv:2402.10171, 2024
70 · 2024
To repeat or not to repeat: Insights from scaling LLM under token-crisis
F Xue, Y Fu, W Zhou, Z Zheng, Y You
Advances in Neural Information Processing Systems 36, 2024
70 · 2024
Prototypical representation learning for relation extraction
N Ding, X Wang, Y Fu, G Xu, R Wang, P Xie, Y Shen, F Huang, HT Zheng, ...
arXiv preprint arXiv:2103.11647, 2021
69 · 2021
Noisy-labeled NER with confidence estimation
K Liu, Y Fu, C Tan, M Chen, N Zhang, S Huang, S Gao
arXiv preprint arXiv:2104.04318, 2021
62 · 2021
How does GPT obtain its ability? Tracing emergent abilities of language models to their sources
Y Fu, H Peng, T Khot
https://yaofu.notion.site/How-does-GPT-Obtain-its-Ability-Tracing-Emergent …, 2022
60 · 2022
OpenMoE: An early effort on open mixture-of-experts language models
F Xue, Z Zheng, Y Fu, J Ni, Z Zheng, W Zhou, Y You
arXiv preprint arXiv:2402.01739, 2024
56 · 2024
Probing BERT in hyperbolic spaces
B Chen, Y Fu, G Xu, P Xie, C Tan, M Chen, L Jing
arXiv preprint arXiv:2104.03869, 2021
53 · 2021
Nested named entity recognition with partially-observed TreeCRFs
Y Fu, C Tan, M Chen, S Huang, F Huang
Proceedings of the AAAI Conference on Artificial Intelligence 35 (14), 12839 …, 2021
52 · 2021
Chain-of-Thought Hub: A Continuous Effort to Measure Large Language Models' Reasoning Performance
Y Fu, L Ou, M Chen, Y Wan, H Peng, T Khot
arXiv preprint arXiv:2305.17306, 2023
49* · 2023
Natural answer generation with heterogeneous memory
Y Fu, Y Feng
Proceedings of the 2018 Conference of the North American Chapter of the …, 2018
39 · 2018
Retrieval head mechanistically explains long-context factuality
W Wu, Y Wang, G Xiao, H Peng, Y Fu
arXiv preprint arXiv:2404.15574, 2024
31 · 2024
Data-to-text generation with variational sequential planning
R Puduppully, Y Fu, M Lapata
Transactions of the Association for Computational Linguistics 10, 697-715, 2022
30 · 2022
Rethinking text attribute transfer: A lexical analysis
Y Fu, H Zhou, J Chen, L Li
arXiv preprint arXiv:1909.12335, 2019
21 · 2019
Articles 1–20