Tool learning with foundation models

Y Qin, S Hu, Y Lin, W Chen, N Ding, G Cui… - ACM Computing …, 2024 - dl.acm.org
Humans possess an extraordinary ability to create and utilize tools. With the advent of
foundation models, artificial intelligence systems have the potential to be equally adept in …

A survey of controllable text generation using transformer-based pre-trained language models

H Zhang, H Song, S Li, M Zhou, D Song - ACM Computing Surveys, 2023 - dl.acm.org
Controllable Text Generation (CTG) is an emerging area in the field of natural language
generation (NLG). It is regarded as crucial for the development of advanced text generation …

An augmented benchmark dataset for geometric question answering through dual parallel text encoding

J Cao, J Xiao - Proceedings of the 29th international conference …, 2022 - aclanthology.org
Automatic math problem solving has attracted much attention of NLP researchers recently.
However, most of the works focus on the solving of Math Word Problems (MWPs). In this …

Cue-CoT: Chain-of-thought prompting for responding to in-depth dialogue questions with LLMs

H Wang, R Wang, F Mi, Y Deng, Z Wang… - arXiv preprint arXiv …, 2023 - arxiv.org
Large Language Models (LLMs), such as ChatGPT, greatly empower dialogue
systems with strong language understanding and generation capabilities. However, most of …

Writer-defined AI personas for on-demand feedback generation

K Benharrak, T Zindulka, F Lehmann, H Heuer… - Proceedings of the …, 2024 - dl.acm.org
Compelling writing is tailored to its audience. This is challenging, as writers may struggle to
empathize with readers, get feedback in time, or gain access to the target group. We …

Can LLM be a Personalized Judge?

YR Dong, T Hu, N Collier - arXiv preprint arXiv:2406.11657, 2024 - arxiv.org
Ensuring that large language models (LLMs) reflect diverse user values and preferences is
crucial as their user bases expand globally. It is therefore encouraging to see the growing …

Large language models as source planner for personalized knowledge-grounded dialogue

H Wang, M Hu, Y Deng, R Wang, F Mi, W Wang… - arXiv preprint arXiv …, 2023 - arxiv.org
Open-domain dialogue systems usually require different sources of knowledge to generate
more informative and evidential responses. However, existing knowledge-grounded …

UniMS-RAG: A unified multi-source retrieval-augmented generation for personalized dialogue systems

H Wang, W Huang, Y Deng, R Wang, Z Wang… - arXiv preprint arXiv …, 2024 - arxiv.org
Large Language Models (LLMs) have shown exceptional capabilities in many natural
language understanding and generation tasks. However, the personalization issue still …

A model-agnostic data manipulation method for persona-based dialogue generation

Y Cao, W Bi, M Fang, S Shi, D Tao - arXiv preprint arXiv:2204.09867, 2022 - arxiv.org
Towards building intelligent dialogue agents, there has been a growing interest in
introducing explicit personas in generation models. However, with limited persona-based …