Se June Joo
Verified email at kaist.ac.kr
Title
Cited by
Year
The cot collection: Improving zero-shot and few-shot learning of language models via chain-of-thought fine-tuning
S Kim, SJ Joo, D Kim, J Jang, S Ye, J Shin, M Seo
arXiv preprint arXiv:2305.14045, 2023
Cited by 80 · 2023
Mind the gap! injecting commonsense knowledge for abstractive dialogue summarization
S Kim, SJ Joo, H Chae, C Kim, S Hwang, J Yeo
arXiv preprint arXiv:2209.00930, 2022
Cited by 21 · 2022
Latent action pretraining from videos
S Ye, J Jang, B Jeon, S Joo, J Yang, B Peng, A Mandlekar, R Tan, ...
arXiv preprint arXiv:2410.11758, 2024
Cited by 9 · 2024
Cotever: Chain of thought prompting annotation toolkit for explanation verification
S Kim, SJ Joo, Y Jang, H Chae, J Yeo
arXiv preprint arXiv:2303.03628, 2023
Cited by 8 · 2023
How Well Do Large Language Models Truly Ground?
H Lee, S Joo, C Kim, J Jang, D Kim, KW On, M Seo
arXiv preprint arXiv:2311.09069, 2023
Cited by 5 · 2023
The BiGGen Bench: A Principled Benchmark for Fine-grained Evaluation of Language Models with Language Models
S Kim, J Suk, JY Cho, S Longpre, C Kim, D Yoon, G Son, Y Cho, ...
arXiv preprint arXiv:2406.05761, 2024
Cited by 3 · 2024
Semiparametric Token-Sequence Co-Supervision
H Lee, D Kim, J Jun, S Joo, J Jang, KW On, M Seo
arXiv preprint arXiv:2403.09024, 2024
2024
SG-MLP: Switch Gated Multi-Layer Perceptron Model for Natural Language Understanding
G Son, S Kim, SJ Joo, W Cho, JE Nah
Proceedings of the Korea Information Processing Society Conference, 1116-1119, 2021
2021
Development and Implementation of a Conditional Gated Multi-Layer Perceptron Model for Natural Language Processing
G Son, S Kim, SJ Joo, W Cho, JE Nah
Proceedings of the Korea Information Processing Society Conference 28 (2), 1116-1119, 2021
2021
Articles 1–9