Introducing neural bag of whole-words with ColBERTer: Contextualized late interactions using enhanced reduction

S Hofstätter, O Khattab, S Althammer… - Proceedings of the 31st …, 2022 - dl.acm.org
Recent progress in neural information retrieval has demonstrated large gains in quality,
while often sacrificing efficiency and interpretability compared to classical approaches. We …
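ColBERTer builds on ColBERT-style contextualized late interaction, where per-token query and document embeddings are compared with a MaxSim operator. Below is a minimal sketch of that generic late-interaction scoring rule, not the ColBERTer reduction pipeline itself; the embedding dimensions and toy random inputs are illustrative assumptions.

```python
# Minimal sketch of ColBERT-style late interaction (MaxSim) scoring.
# Shapes and toy inputs are illustrative assumptions, not the ColBERTer pipeline.
import numpy as np

def late_interaction_score(query_vecs: np.ndarray, doc_vecs: np.ndarray) -> float:
    """Sum over query tokens of the max cosine similarity to any doc token."""
    q = query_vecs / np.linalg.norm(query_vecs, axis=1, keepdims=True)
    d = doc_vecs / np.linalg.norm(doc_vecs, axis=1, keepdims=True)
    sim = q @ d.T                         # (num_query_tokens, num_doc_tokens)
    return float(sim.max(axis=1).sum())   # MaxSim per query token, then sum

# Toy example: 3 query-token vectors scored against 5 document-token vectors.
rng = np.random.default_rng(0)
print(late_interaction_score(rng.normal(size=(3, 32)), rng.normal(size=(5, 32))))
```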

Efficient neural ranking using forward indexes and lightweight encoders

J Leonhardt, H Müller, K Rudra, M Khosla… - ACM Transactions on …, 2024 - dl.acm.org
Dual-encoder-based dense retrieval models have become the standard in IR. They employ
large Transformer-based language models, which are notoriously inefficient in terms of …
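Dual-encoder (bi-encoder) dense retrieval encodes documents offline into vectors and ranks them at query time with a dot product against the query vector. A minimal sketch under that assumption follows; encode_doc and encode_query are hypothetical placeholders standing in for Transformer encoders, not the paper's models.

```python
# Minimal sketch of single-vector dual-encoder retrieval: documents are
# embedded once offline, queries online, and ranking is a dot product.
# The toy "encoders" below are illustrative placeholders only.
import numpy as np

def encode_doc(text: str, dim: int = 64) -> np.ndarray:
    rng = np.random.default_rng(abs(hash(text)) % (2**32))
    return rng.normal(size=dim)

def encode_query(text: str, dim: int = 64) -> np.ndarray:
    return encode_doc(text, dim)  # placeholder: same toy encoder

docs = ["neural ranking", "forward indexes", "lightweight encoders"]
doc_matrix = np.stack([encode_doc(d) for d in docs])   # built offline

q = encode_query("efficient neural ranking")
scores = doc_matrix @ q                                # one dot product per document
print([docs[i] for i in np.argsort(-scores)])
```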

On the interpolation of contextualized term-based ranking with BM25 for query-by-example retrieval

A Abolghasemi, A Askari, S Verberne - Proceedings of the 2022 ACM …, 2022 - dl.acm.org
Term-based ranking with pre-trained transformer-based language models has recently
gained attention, as it brings the contextualization power of transformer models into the …
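Interpolation of this kind is typically a convex combination of a normalized BM25 score and a neural ranking score per document. A minimal sketch follows, assuming min-max normalization and an illustrative weight alpha = 0.5; neither choice is taken from the paper.

```python
# Minimal sketch of interpolating a neural ranker's scores with BM25 via a
# convex combination. Min-max normalization and alpha = 0.5 are common
# choices used here only for illustration.

def minmax(scores: dict[str, float]) -> dict[str, float]:
    lo, hi = min(scores.values()), max(scores.values())
    span = (hi - lo) or 1.0
    return {doc: (s - lo) / span for doc, s in scores.items()}

def interpolate(bm25: dict[str, float], neural: dict[str, float], alpha: float = 0.5):
    b, n = minmax(bm25), minmax(neural)
    fused = {doc: alpha * b[doc] + (1 - alpha) * n.get(doc, 0.0) for doc in b}
    return sorted(fused.items(), key=lambda kv: -kv[1])

bm25_scores = {"d1": 12.3, "d2": 8.7, "d3": 5.1}
neural_scores = {"d1": 0.61, "d2": 0.83, "d3": 0.42}
print(interpolate(bm25_scores, neural_scores))
```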

Efficient neural ranking using forward indexes

J Leonhardt, K Rudra, M Khosla, A Anand… - Proceedings of the ACM …, 2022 - dl.acm.org
Neural document ranking approaches, specifically transformer models, have achieved
impressive gains in ranking performance. However, query processing using such over …
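The forward-index idea stores precomputed document representations keyed by document id, so that at query time only the first-stage candidates need a lookup and a cheap similarity computation rather than a full encoder pass. A minimal sketch under that reading follows; the toy vectors, ids, and interpolation weight are illustrative assumptions, not the paper's configuration.

```python
# Minimal sketch of re-ranking with a forward index: dense document vectors
# are precomputed offline and stored keyed by doc id, so re-scoring a
# first-stage candidate is a lookup plus a dot product (no document encoding
# at query time). Toy vectors, ids, and weights are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
forward_index = {f"d{i}": rng.normal(size=64) for i in range(100)}  # built offline

def rerank(query_vec, candidates, alpha=0.3):
    """candidates: list of (doc_id, lexical_score) from the first-stage retriever."""
    out = []
    for doc_id, lex_score in candidates:
        dense_score = float(forward_index[doc_id] @ query_vec)  # O(1) lookup
        out.append((doc_id, alpha * lex_score + (1 - alpha) * dense_score))
    return sorted(out, key=lambda kv: -kv[1])

query_vec = rng.normal(size=64)
first_stage = [("d3", 11.2), ("d17", 9.8), ("d42", 7.5)]
print(rerank(query_vec, first_stage))
```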