Retrieval augmented zero-shot text classification

T Abdullahi, R Singh, C Eickhoff - Proceedings of the 2024 ACM SIGIR …, 2024 - dl.acm.org
Zero-shot text learning enables text classifiers to handle unseen classes efficiently,
alleviating the need for task-specific training data. A simple approach often relies on …

Augmenting passage representations with query generation for enhanced cross-lingual dense retrieval

S Zhuang, L Shou, G Zuccon - Proceedings of the 46th International ACM …, 2023 - dl.acm.org
Effective cross-lingual dense retrieval methods that rely on multilingual pre-trained language
models (PLMs) need to be trained to encompass both the relevance matching task and the …

KEIR@ECIR 2024: The first workshop on knowledge-enhanced information retrieval

Z Meng, S Liang, X Xin, G Moro, E Kanoulas… - … on Information Retrieval, 2024 - Springer
The infusion of external knowledge bases into IR models can provide enhanced ranking
results and greater interpretability, offering substantial advancements in the field. The first …

ReFIT: Relevance Feedback from a Reranker during Inference

RG Reddy, P Dasigi, MA Sultan, A Cohan, A Sil… - arXiv preprint arXiv …, 2023 - arxiv.org
Retrieve-and-rerank is a prevalent framework in neural information retrieval, wherein a bi-
encoder network initially retrieves a pre-defined number of candidates (e.g., K=100), which …

Online Distillation for Pseudo-Relevance Feedback

S MacAvaney, X Wang - arXiv preprint arXiv:2306.09657, 2023 - arxiv.org
Model distillation has emerged as a prominent technique to improve neural search models.
To date, distillation has taken an offline approach, wherein a new neural model is trained to …