Tianhua Tao
Paul G. Allen School of Computer Science & Engineering, University of Washington
Verified email at cs.washington.edu
Title
Cited by
Year
Language models meet world models: Embodied experiences enhance language models
J Xiang, T Tao, Y Gu, T Shu, Z Wang, Z Yang, Z Hu
Advances in neural information processing systems 36, 75392-75412, 2023
Cited by 96 · 2023
Llm360: Towards fully transparent open-source llms
Z Liu, A Qiao, W Neiswanger, H Wang, B Tan, T Tao, J Li, Y Wang, S Sun, ...
arXiv preprint arXiv:2312.06550, 2023
Cited by 56 · 2023
Slimpajama-dc: Understanding data combinations for llm training
Z Shen, T Tao, L Ma, W Neiswanger, Z Liu, H Wang, B Tan, J Hestness, ...
arXiv preprint arXiv:2309.10818, 2023
Cited by 42 · 2023
Don't take it literally: An edit-invariant sequence loss for text generation
G Liu, Z Yang, T Tao, X Liang, J Bao, Z Li, X He, S Cui, Z Hu
arXiv preprint arXiv:2106.15078, 2021
Cited by 24 · 2021
On the learning of non-autoregressive transformers
F Huang, T Tao, H Zhou, L Li, M Huang
International Conference on Machine Learning, 9356-9376, 2022
Cited by 20 · 2022
Pandora: Towards General World Model with Natural Language Actions and Video States
J Xiang, G Liu, Y Gu, Q Gao, Y Ning, Y Zha, Z Feng, T Tao, S Hao, Y Shi, ...
arXiv preprint arXiv:2406.09455, 2024
Cited by 18 · 2024
Scicode: A research coding benchmark curated by scientists
M Tian, L Gao, SD Zhang, X Chen, C Fan, X Guo, R Haas, P Ji, ...
arXiv preprint arXiv:2407.13168, 2024
Cited by 8 · 2024
Web2Code: A Large-scale Webpage-to-Code Dataset and Evaluation Framework for Multimodal LLMs
S Yun, H Lin, R Thushara, MQ Bhat, Y Wang, Z Jiang, M Deng, J Wang, ...
arXiv preprint arXiv:2406.20098, 2024
Cited by 6 · 2024
Mixture of experts enable efficient and effective protein understanding and design
N Sun, S Zou, T Tao, S Mahbub, D Li, Y Zhuang, H Wang, X Cheng, ...
bioRxiv, 2024.11.29.625425, 2024
Cited by 4 · 2024
Crystal: Illuminating LLM abilities on language and code
T Tao, J Li, B Tan, H Wang, W Marshall, BM Kanakiya, J Hestness, ...
arXiv preprint arXiv:2411.04156, 2024
Cited by 3 · 2024
Accurate and general dna representations emerge from genome foundation models at scale
CN Ellington, N Sun, N Ho, T Tao, S Mahbub, D Li, Y Zhuang, H Wang, ...
bioRxiv, 2024.12.01.625444, 2024
Cited by 1 · 2024
Scaling dense representations for single cell with transcriptome-scale context
N Ho, CN Ellington, J Hou, S Addagudi, S Mo, T Tao, D Li, Y Zhuang, ...
bioRxiv, 2024.11.28.625303, 2024
Cited by 1 · 2024
A large-scale foundation model for rna function and structure prediction
S Zou, T Tao, S Mahbub, CN Ellington, RJ Algayres, D Li, Y Zhuang, ...
bioRxiv, 2024.11.28.625345, 2024
Cited by 1 · 2024
LLM360 K2-65B: Scaling Up Fully Transparent Open-Source LLMs
B Tan, H Wang, W Neiswanger, T Tao, H Li, F Koto, Y Wang, S Sun, ...
Cited by 1 · 2024
LLM360 K2: Building a 65B 360-Open-Source Large Language Model from Scratch
Z Liu, B Tan, H Wang, W Neiswanger, T Tao, H Li, F Koto, Y Wang, S Sun, ...
arXiv preprint arXiv:2501.07124, 2025
2025
LLM360 K2: Scaling Up 360-Open-Source Large Language Models
Z Liu, B Tan, H Wang, W Neiswanger, T Tao, H Li, F Koto, Y Wang, S Sun, ...
arXiv preprint arXiv:2501.07124, 2025
2025
Articles 1–16