Di Zhang
PhD Candidate at Fudan University
Verified email at ustc.edu · Homepage
Title · Cited by · Year
ChemLLM: A chemical large language model
D Zhang, W Liu, Q Tan, J Chen, H Yan, Y Yan, J Li, W Huang, X Yue, ...
arXiv preprint arXiv:2402.06852, 2024
49 · 2024
Accessing GPT-4 level mathematical olympiad solutions via Monte Carlo tree self-refine with LLaMA-3 8B
D Zhang, X Huang, D Zhou, Y Li, W Ouyang
arXiv preprint arXiv:2406.07394, 2024
32 · 2024
LLaMA-Berry: Pairwise optimization for o1-like olympiad-level mathematical reasoning
D Zhang, J Wu, J Lei, T Che, J Li, T Xie, X Huang, S Zhang, M Pavone, ...
arXiv preprint arXiv:2410.02884, 2024
22 · 2024
ChemVLM: Exploring the power of multimodal large language models in chemistry area
J Li, D Zhang, X Wang, Z Hao, J Lei, Q Tan, C Zhou, W Liu, Y Yang, ...
arXiv preprint arXiv:2408.07246, 2024
6* · 2024
Critic-V: VLM critics help catch VLM errors in multimodal reasoning
D Zhang, J Lei, J Li, X Wang, Y Liu, Z Yang, J Li, W Wang, S Yang, J Wu, ...
arXiv preprint arXiv:2411.18203, 2024
3 · 2024
Biology Instructions: A Dataset and Benchmark for Multi-Omics Sequence Understanding Capability of Large Language Models
H He, Y Ren, Y Tang, Z Xu, J Li, M Yang, D Zhang, D Yuan, T Chen, ...
arXiv preprint arXiv:2412.19191, 2024
2024
MolReFlect: Towards In-Context Fine-grained Alignments between Molecules and Texts
J Li, Y Liu, W Liu, J Le, D Zhang, W Fan, D Zhou, Y Li, Q Li
arXiv preprint arXiv:2411.14721, 2024
2024
Articles 1–7