Quanyu Long
Verified email at e.ntu.edu.sg - Homepage
Title
Cited by
Year
Generative Imagination Elevates Machine Translation
Q Long, M Wang, L Li
NAACL 2021, 2021
Cited by 43 · 2021
On the Robustness of Language Encoders against Grammatical Errors
F Yin, Q Long, T Meng, KW Chang
ACL 2020, 2020
Cited by 33 · 2020
Backdoor attacks on dense passage retrievers for disseminating misinformation
Q Long, Y Deng, LL Gan, W Wang, S Jialin Pan
arXiv preprint arXiv:2402.13532, 2024
Cited by 24 · 2024
Domain Confused Contrastive Learning for Unsupervised Domain Adaptation
Q Long, T Luo, W Wang, S Pan
NAACL 2022, 2022
Cited by 20 · 2022
Adapt in Contexts: Retrieval-Augmented Domain Adaptation via In-Context Learning
Q Long, W Wang, SJ Pan
EMNLP 2023, 6525–6542, 2023
Cited by 11 · 2023
QA4IE: A question answering based system for document-level general information extraction
L Qiu, D Ru, Q Long, W Zhang, Y Yu
IEEE Access 8, 29677–29689, 2020
Cited by 8 · 2020
Decomposing Label Space, Format and Discrimination: Rethinking How LLMs Respond and Solve Tasks via In-Context Learning
Q Long, Y Wu, W Wang, SJ Pan
arXiv preprint arXiv:2404.07546, 2024
Cited by 5 · 2024
T2I-FactualBench: Benchmarking the Factuality of Text-to-Image Models with Knowledge-Intensive Concepts
Z Huang, W He, Q Long, Y Wang, H Li, Z Yu, F Shu, L Chen, H Jiang, ...
arXiv preprint arXiv:2412.04300, 2024
2024
Decomposition Dilemmas: Does Claim Decomposition Boost or Burden Fact-Checking Performance?
Q Hu, Q Long, W Wang
arXiv preprint arXiv:2411.02400, 2024
2024
Large Language Models Know What Makes Exemplary Contexts
Q Long, J Chen, W Wang, SJ Pan
arXiv preprint arXiv:2408.07505, 2024
2024
Does In-Context Learning Really Learn? Rethinking How Large Language Models Respond and Solve Tasks via In-Context Learning
Q Long, Y Wu, W Wang, SJ Pan
First Conference on Language Modeling, 2024
2024