Retrieving and reading: A comprehensive survey on open-domain question answering
Open-domain Question Answering (OpenQA) is an important task in Natural Language
Processing (NLP), which aims to answer a question in the form of natural language based …
Conversational question answering: A survey
Question answering (QA) systems provide a way of querying the information available in
various formats including, but not limited to, unstructured and structured data in natural …
Interleaving retrieval with chain-of-thought reasoning for knowledge-intensive multi-step questions
Prompting-based large language models (LLMs) are surprisingly powerful at generating
natural language reasoning steps or Chains-of-Thoughts (CoT) for multi-step question …
DSPy: Compiling declarative language model calls into self-improving pipelines
The ML community is rapidly exploring techniques for prompting language models (LMs)
and for stacking them into pipelines that solve complex tasks. Unfortunately, existing LM …
Learning to retrieve reasoning paths over wikipedia graph for question answering
Answering questions that require multi-hop reasoning at web-scale necessitates retrieving
multiple evidence documents, one of which often has little lexical or semantic relationship to …
Open question answering over tables and text
In open question answering (QA), the answer to a question is produced by retrieving and
then analyzing documents that might contain answers to the question. Most open QA …
Iteratively prompt pre-trained language models for chain of thought
While Pre-trained Language Models (PLMs) internalize a great amount of world knowledge,
they have been shown incapable of recalling this knowledge to solve tasks requiring …
Hierarchical graph network for multi-hop question answering
In this paper, we present Hierarchical Graph Network (HGN) for multi-hop question
answering. To aggregate clues from scattered texts across multiple paragraphs, a …
The NLP cookbook: modern recipes for transformer based deep learning architectures
In recent years, Natural Language Processing (NLP) models have achieved phenomenal
success in linguistic and semantic tasks like text classification, machine translation, cognitive …
Break It Down: A Question Understanding Benchmark
Understanding natural language questions entails the ability to break down a question into
the requisite steps for computing its answer. In this work, we introduce a Question …