[BOOK][B] Pretrained transformers for text ranking: BERT and beyond
The goal of text ranking is to generate an ordered list of texts retrieved from a corpus in
response to a query. Although the most common formulation of text ranking is search …
Document ranking with a pretrained sequence-to-sequence model
This work proposes a novel adaptation of a pretrained sequence-to-sequence model to the
task of document ranking. Our approach is fundamentally different from a commonly …
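The approach summarized here casts relevance as text generation: the sequence-to-sequence model is prompted with the query and document and asked to emit a relevance token. Below is a minimal sketch of that idea using Hugging Face Transformers, assuming a T5 checkpoint fine-tuned to produce "true"/"false" after a "Query: … Document: … Relevant:" prompt; the checkpoint name is a placeholder and the prompt format follows the general recipe rather than an exact reproduction.

```python
import torch
from transformers import AutoTokenizer, T5ForConditionalGeneration

# Placeholder checkpoint; substitute a T5 model fine-tuned for this relevance-token target.
MODEL_NAME = "t5-base"

tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
model = T5ForConditionalGeneration.from_pretrained(MODEL_NAME).eval()

def seq2seq_relevance_score(query: str, document: str) -> float:
    """Score a query-document pair by the model's probability of generating 'true' vs. 'false'."""
    prompt = f"Query: {query} Document: {document} Relevant:"
    inputs = tokenizer(prompt, return_tensors="pt", truncation=True, max_length=512)
    # Decode a single step; T5's decoder starts from the decoder_start_token_id.
    decoder_input_ids = torch.full((1, 1), model.config.decoder_start_token_id, dtype=torch.long)
    with torch.no_grad():
        logits = model(**inputs, decoder_input_ids=decoder_input_ids).logits[0, -1]
    true_id = tokenizer.encode("true", add_special_tokens=False)[0]
    false_id = tokenizer.encode("false", add_special_tokens=False)[0]
    # Softmax over just the two target tokens yields the relevance probability used for ranking.
    probs = torch.softmax(logits[[true_id, false_id]], dim=0)
    return probs[0].item()
```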
Passage Re-ranking with BERT
Recently, neural models pretrained on a language modeling task, such as ELMo (Peters et
al., 2017), OpenAI GPT (Radford et al., 2018), and BERT (Devlin et al., 2018), have achieved …
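The re-ranking setup this abstract refers to is commonly implemented as a cross-encoder: the query and a candidate passage are fed to BERT jointly and a classification head scores their relevance. A minimal sketch with Hugging Face Transformers follows; the checkpoint name is a placeholder for a model fine-tuned on query-passage relevance labels (e.g., on MS MARCO).

```python
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

# Placeholder checkpoint; in practice a BERT model fine-tuned for query-passage relevance.
MODEL_NAME = "bert-base-uncased"

tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
model = AutoModelForSequenceClassification.from_pretrained(MODEL_NAME, num_labels=2).eval()

def rerank(query: str, passages: list[str]) -> list[tuple[float, str]]:
    """Re-rank candidate passages by the relevance probability of each (query, passage) pair."""
    scored = []
    for passage in passages:
        # BERT sees both texts jointly: [CLS] query [SEP] passage [SEP]
        inputs = tokenizer(query, passage, return_tensors="pt", truncation=True, max_length=512)
        with torch.no_grad():
            logits = model(**inputs).logits
        score = torch.softmax(logits, dim=-1)[0, 1].item()  # probability of the "relevant" class
        scored.append((score, passage))
    return sorted(scored, reverse=True)
```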
Utilizing BERT for Information Retrieval: Survey, Applications, Resources, and Challenges
Recent years have witnessed a substantial increase in the use of deep learning to solve
various natural language processing (NLP) problems. Early deep learning models were …
Multi-stage document ranking with BERT
The advent of deep neural networks pre-trained via language modeling tasks has spurred a
number of successful applications in natural language processing. This work explores one …
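The multi-stage idea can be summarized as a cascade: an inexpensive retriever produces candidates, a pointwise re-ranker prunes them, and a pairwise re-ranker refines the top of the list. The sketch below shows only the control flow; `pointwise_score(query, doc)` and `pairwise_score(query, doc_a, doc_b)` are hypothetical stand-ins for BERT re-rankers such as those in a monoBERT/duoBERT pipeline.

```python
def multistage_rank(query, bm25_candidates, pointwise_score, pairwise_score,
                    k_pointwise=50, k_pairwise=10):
    """Stage 1 (candidate retrieval, e.g. BM25) is assumed done by the caller.
    Stage 2: pointwise re-ranking keeps the top k_pointwise texts.
    Stage 3: pairwise re-ranking refines the ordering of the top k_pairwise texts."""
    # Stage 2: score each candidate independently against the query.
    stage2 = sorted(bm25_candidates, key=lambda d: pointwise_score(query, d),
                    reverse=True)[:k_pointwise]

    # Stage 3: aggregate pairwise preferences among the best pointwise candidates.
    head, tail = stage2[:k_pairwise], stage2[k_pairwise:]
    agg = {d: sum(pairwise_score(query, d, other) for other in head if other is not d)
           for d in head}
    refined = sorted(head, key=lambda d: agg[d], reverse=True)
    return refined + tail
```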
CEDR: Contextualized embeddings for document ranking
Although considerable attention has been given to neural ranking architectures recently, far
less attention has been paid to the term representations that are used as input to these …
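CEDR's central move is to feed contextualized token embeddings into an existing interaction-based ranker in place of static word vectors. A minimal sketch of one such combination, KNRM-style kernel pooling over a similarity matrix built from contextual embeddings, is shown below; the kernel parameters are illustrative and the learned scoring layer is replaced by a simple sum.

```python
import torch

def kernel_pooling_score(query_emb: torch.Tensor, doc_emb: torch.Tensor,
                         mus=(-0.9, -0.5, -0.1, 0.3, 0.7, 0.9, 1.0),
                         sigma=0.1) -> torch.Tensor:
    """KNRM-style scoring over contextualized embeddings (a CEDR-like setup).
    query_emb: [n_q, dim] and doc_emb: [n_d, dim] token vectors, e.g. BERT hidden states."""
    # Cosine similarity between every query token and every document token.
    q = torch.nn.functional.normalize(query_emb, dim=-1)
    d = torch.nn.functional.normalize(doc_emb, dim=-1)
    sim = q @ d.T                                            # [n_q, n_d]
    features = []
    for mu in mus:
        # Each RBF kernel softly counts similarities near mu, pooled over document then query tokens.
        kernel = torch.exp(-0.5 * (sim - mu) ** 2 / sigma ** 2)
        features.append(torch.log1p(kernel.sum(dim=1)).sum())
    # In the full model these kernel features feed a small learned layer; summing is a stand-in here.
    return torch.stack(features).sum()
```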
A deep look into neural ranking models for information retrieval
Ranking models lie at the heart of research on information retrieval (IR). During the past
decades, different techniques have been proposed for constructing ranking models, from …
PARADE: Passage Representation Aggregation for Document Reranking
Pre-trained transformer models, such as BERT and T5, have shown to be highly effective at
ad hoc passage and document ranking. Due to the inherent sequence length limits of these …
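PARADE addresses the sequence-length limit by encoding a long document as a set of passages and aggregating their representations into a single document-level representation. The sketch below uses the simplest aggregator, max pooling over per-passage [CLS] vectors; the paper also studies attention- and transformer-based aggregation. The model name and the word-based passage splitting are illustrative.

```python
import torch
from transformers import AutoModel, AutoTokenizer

# Placeholder encoder; PARADE variants differ mainly in how passage representations are aggregated.
MODEL_NAME = "bert-base-uncased"
tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
encoder = AutoModel.from_pretrained(MODEL_NAME).eval()

def document_representation(query: str, document: str, passage_len: int = 200) -> torch.Tensor:
    """Split a long document into passages, encode each together with the query,
    and max-pool the per-passage [CLS] vectors into one document representation."""
    words = document.split()
    passages = [" ".join(words[i:i + passage_len]) for i in range(0, len(words), passage_len)]
    cls_vectors = []
    for passage in passages:
        inputs = tokenizer(query, passage, return_tensors="pt", truncation=True, max_length=512)
        with torch.no_grad():
            hidden = encoder(**inputs).last_hidden_state     # [1, seq_len, dim]
        cls_vectors.append(hidden[0, 0])                     # [CLS] token representation
    # Max pooling is the simplest aggregator studied; a linear layer on top of the pooled
    # vector would produce the final relevance score.
    return torch.stack(cls_vectors).max(dim=0).values        # [dim]
```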
Rethinking search: making domain experts out of dilettantes
When experiencing an information need, users want to engage with a domain expert, but
often turn to an information retrieval system, such as a search engine, instead. Classical …
An introduction to neural information retrieval
Neural ranking models for information retrieval (IR) use shallow or deep neural networks to
rank search results in response to a query. Traditional learning to rank models employ …