Retrieving and reading: A comprehensive survey on open-domain question answering

F Zhu, W Lei, C Wang, J Zheng, S Poria… - arXiv preprint arXiv …, 2021 - arxiv.org
Open-domain Question Answering (OpenQA) is an important task in Natural Language
Processing (NLP), which aims to answer a question in the form of natural language based …

Conversational question answering: A survey

M Zaib, WE Zhang, QZ Sheng, A Mahmood… - … and Information Systems, 2022 - Springer
Question answering (QA) systems provide a way of querying the information available in
various formats including, but not limited to, unstructured and structured data in natural …

Interleaving retrieval with chain-of-thought reasoning for knowledge-intensive multi-step questions

H Trivedi, N Balasubramanian, T Khot… - arXiv preprint arXiv …, 2022 - arxiv.org
Prompting-based large language models (LLMs) are surprisingly powerful at generating
natural language reasoning steps or Chains-of-Thoughts (CoT) for multi-step question …

DSPy: Compiling declarative language model calls into self-improving pipelines

O Khattab, A Singhvi, P Maheshwari, Z Zhang… - arXiv preprint arXiv …, 2023 - arxiv.org
The ML community is rapidly exploring techniques for prompting language models (LMs)
and for stacking them into pipelines that solve complex tasks. Unfortunately, existing LM …

Learning to retrieve reasoning paths over wikipedia graph for question answering

A Asai, K Hashimoto, H Hajishirzi, R Socher… - arXiv preprint arXiv …, 2019 - arxiv.org
Answering questions that require multi-hop reasoning at web-scale necessitates retrieving
multiple evidence documents, one of which often has little lexical or semantic relationship to …

Open question answering over tables and text

W Chen, MW Chang, E Schlinger, W Wang… - arXiv preprint arXiv …, 2020 - arxiv.org
In open question answering (QA), the answer to a question is produced by retrieving and
then analyzing documents that might contain answers to the question. Most open QA …

Iteratively prompt pre-trained language models for chain of thought

B Wang, X Deng, H Sun - arXiv preprint arXiv:2203.08383, 2022 - arxiv.org
While Pre-trained Language Models (PLMs) internalize a great amount of world knowledge,
they have been shown incapable of recalling this knowledge to solve tasks requiring …

Hierarchical graph network for multi-hop question answering

Y Fang, S Sun, Z Gan, R Pillai, S Wang… - arXiv preprint arXiv …, 2019 - arxiv.org
In this paper, we present Hierarchical Graph Network (HGN) for multi-hop question
answering. To aggregate clues from scattered texts across multiple paragraphs, a …

The NLP cookbook: modern recipes for transformer based deep learning architectures

S Singh, A Mahmood - IEEE Access, 2021 - ieeexplore.ieee.org
In recent years, Natural Language Processing (NLP) models have achieved phenomenal
success in linguistic and semantic tasks like text classification, machine translation, cognitive …

Break It Down: A Question Understanding Benchmark

T Wolfson, M Geva, A Gupta, M Gardner… - Transactions of the …, 2020 - direct.mit.edu
Understanding natural language questions entails the ability to break down a question into
the requisite steps for computing its answer. In this work, we introduce a Question …