Semantic models for the first-stage retrieval: A comprehensive review

J Guo, Y Cai, Y Fan, F Sun, R Zhang… - ACM Transactions on …, 2022 - dl.acm.org
Multi-stage ranking pipelines have been a practical solution in modern search systems,
where the first-stage retrieval returns a subset of candidate documents and later stages …

Information retrieval: recent advances and beyond

KA Hambarde, H Proença - IEEE Access, 2023 - ieeexplore.ieee.org
This paper provides a thorough overview of the models and techniques
used in the first and second stages of the typical information retrieval processing chain …

ColBERTv2: Effective and efficient retrieval via lightweight late interaction

K Santhanam, O Khattab, J Saad-Falcon… - arXiv preprint arXiv …, 2021 - arxiv.org
Neural information retrieval (IR) has greatly advanced search and other knowledge-
intensive language tasks. While many neural IR methods encode queries and documents …
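The "late interaction" named in the title can be illustrated with a minimal sketch (not the authors' code): query and document are each represented as a matrix of per-token embeddings, and the relevance score sums, over query tokens, the maximum similarity to any document token (MaxSim). The NumPy snippet below assumes pre-computed, L2-normalized embedding matrices; all names and shapes are illustrative.

```python
import numpy as np

def late_interaction_score(query_emb: np.ndarray, doc_emb: np.ndarray) -> float:
    """MaxSim-style late interaction: for each query token embedding, take its
    best (maximum) dot-product match among all document token embeddings, then
    sum these maxima over the query tokens.

    query_emb: (num_query_tokens, dim), doc_emb: (num_doc_tokens, dim),
    both assumed L2-normalized so dot products equal cosine similarities.
    """
    sim = query_emb @ doc_emb.T             # (q_tokens, d_tokens) similarity matrix
    return float(sim.max(axis=1).sum())     # best match per query token, summed

# Toy usage with random, normalized token embeddings standing in for an encoder.
rng = np.random.default_rng(0)
q = rng.normal(size=(4, 128)); q /= np.linalg.norm(q, axis=1, keepdims=True)
d = rng.normal(size=(50, 128)); d /= np.linalg.norm(d, axis=1, keepdims=True)
print(late_interaction_score(q, d))
```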

Promptagator: Few-shot dense retrieval from 8 examples

Z Dai, VY Zhao, J Ma, Y Luan, J Ni, J Lu… - arXiv preprint arXiv …, 2022 - arxiv.org
Much recent research on information retrieval has focused on how to transfer from one task
(typically with abundant supervised data) to various other tasks where supervision is limited …

Dense text retrieval based on pretrained language models: A survey

WX Zhao, J Liu, R Ren, JR Wen - ACM Transactions on Information …, 2024 - dl.acm.org
Text retrieval is a long-standing research topic in information seeking, where a system is
required to return relevant information resources in response to users' queries in natural language. From …
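As a point of reference for the dense-retrieval setting this survey covers, here is a minimal sketch of bi-encoder first-stage retrieval: queries and documents are encoded independently into single vectors and scored by dot product, with exact top-k selection for simplicity (large collections would instead use an approximate nearest-neighbor index). The embeddings and names below are illustrative stand-ins, not taken from the survey.

```python
import numpy as np

def dense_retrieve(query_vec: np.ndarray, doc_matrix: np.ndarray, k: int = 5):
    """Bi-encoder style first-stage retrieval: relevance is the dot product
    between a single query vector and single document vectors, so the top-k
    documents fall out of one matrix-vector product plus a sort.

    query_vec: (dim,), doc_matrix: (num_docs, dim).
    Returns (indices, scores) of the k highest-scoring documents.
    """
    scores = doc_matrix @ query_vec
    top = np.argsort(-scores)[:k]
    return top, scores[top]

# Toy usage with random vectors standing in for encoder outputs.
rng = np.random.default_rng(1)
docs = rng.normal(size=(1000, 256))
query = rng.normal(size=(256,))
idx, sc = dense_retrieve(query, docs, k=3)
print(idx, sc)
```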

Autoregressive search engines: Generating substrings as document identifiers

M Bevilacqua, G Ottaviano, P Lewis… - Advances in …, 2022 - proceedings.neurips.cc
Knowledge-intensive language tasks require NLP systems to both provide the
correct answer and retrieve supporting evidence for it in a given corpus. Autoregressive …

Unsupervised corpus aware language model pre-training for dense passage retrieval

L Gao, J Callan - arXiv preprint arXiv:2108.05540, 2021 - arxiv.org
Recent research demonstrates the effectiveness of using fine-tuned language models (LMs)
for dense retrieval. However, dense retrievers are hard to train, typically requiring heavily …

RocketQAv2: A joint training method for dense passage retrieval and passage re-ranking

R Ren, Y Qu, J Liu, WX Zhao, Q She, H Wu… - arXiv preprint arXiv …, 2021 - arxiv.org
In various natural language processing tasks, passage retrieval and passage re-ranking are
two key procedures in finding and ranking relevant information. Since the two …

[BOOK] Pretrained transformers for text ranking: BERT and beyond

J Lin, R Nogueira, A Yates - 2022 - books.google.com
The goal of text ranking is to generate an ordered list of texts retrieved from a corpus in
response to a query. Although the most common formulation of text ranking is search …

From distillation to hard negative sampling: Making sparse neural ir models more effective

T Formal, C Lassance, B Piwowarski… - Proceedings of the 45th …, 2022 - dl.acm.org
Neural retrievers based on dense representations combined with Approximate Nearest
Neighbors search have recently received a lot of attention, owing their success to distillation …
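For contrast with the dense representations mentioned in this snippet, a sparse neural retriever of the kind discussed here typically represents query and document as learned term weights over a shared vocabulary and scores them by a dot product over shared terms, which an inverted index computes efficiently. The sketch below assumes the per-term weights were already produced by some learned encoder; the weights and names are illustrative, not from the paper.

```python
def sparse_score(query_weights: dict, doc_weights: dict) -> float:
    """Sparse-retrieval scoring: query and document are {term: weight}
    mappings over a shared vocabulary, and relevance is the dot product
    over the terms they have in common."""
    return sum(w * doc_weights.get(term, 0.0) for term, w in query_weights.items())

# Toy usage with hand-made term weights; a real system would learn these.
query = {"neural": 1.2, "retrieval": 0.9, "sparse": 0.7}
doc_a = {"sparse": 1.1, "retrieval": 0.8, "index": 0.4}
doc_b = {"dense": 1.0, "retrieval": 0.5}
print(sparse_score(query, doc_a), sparse_score(query, doc_b))
```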