Contrastive learning-enhanced nearest neighbor mechanism for multi-label text classification

R Wang, X Dai - Proceedings of the 60th Annual Meeting of the …, 2022 - aclanthology.org
Abstract: Multi-Label Text Classification (MLTC) is a fundamental and challenging task in
natural language processing. Previous studies mainly focus on learning text representation …

Domain adaptation and multi-domain adaptation for neural machine translation: A survey

D Saunders - Journal of Artificial Intelligence Research, 2022 - jair.org
The development of deep learning techniques has allowed Neural Machine Translation
(NMT) models to become extremely powerful, given sufficient training data and training time …

FEDS-ICL: Enhancing translation ability and efficiency of large language model by optimizing demonstration selection

S Zhu, L Pan, D Xiong - Information Processing & Management, 2024 - Elsevier
Large language models (LLMs) that exhibit remarkable in-context learning (ICL) ability
with bilingual demonstrations have been recognized as a potential solution for machine …

Hierarchical text classification with multi-label contrastive learning and KNN

J Zhang, Y Li, F Shen, Y He, H Tan, Y He - Neurocomputing, 2024 - Elsevier
Given the complicated label hierarchy, hierarchical text classification (HTC) has emerged as
a challenging subtask in the realm of multi-label text classification. Existing methods …

Efficient cluster-based k-nearest-neighbor machine translation

D Wang, K Fan, B Chen, D Xiong - arXiv preprint arXiv:2204.06175, 2022 - arxiv.org
k-Nearest-Neighbor Machine Translation (kNN-MT) has been recently proposed as a non-
parametric solution for domain adaptation in neural machine translation (NMT). It aims to …

Chunk-based nearest neighbor machine translation

PH Martins, Z Marinho, AFT Martins - arXiv preprint arXiv:2205.12230, 2022 - arxiv.org
Semi-parametric models, which augment generation with retrieval, have led to impressive
results in language modeling and machine translation, due to their ability to retrieve fine …

Improving few-shot performance of language models via nearest neighbor calibration

F Nie, M Chen, Z Zhang, X Cheng - arXiv preprint arXiv:2212.02216, 2022 - arxiv.org
Pre-trained language models (PLMs) have exhibited remarkable few-shot learning
capabilities when provided a few examples in a natural language prompt as demonstrations …