Progress in machine translation
After more than 70 years of evolution, machine translation has made great strides. In recent years especially, translation quality has improved markedly with the …
Neural machine translation: A review of methods, resources, and tools
Machine translation (MT) is an important sub-field of natural language processing that aims to translate between natural languages using computers. In recent years, end-to-end neural …
Dense text retrieval based on pretrained language models: A survey
Text retrieval is a long-standing research topic in information seeking, where a system is required to return relevant information resources in response to users' queries in natural language. From …
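
The survey above concerns dense retrieval, where queries and passages are embedded by a pretrained language model and ranked by vector similarity. A minimal sketch of that bi-encoder idea follows, assuming the sentence-transformers package and the publicly available all-MiniLM-L6-v2 checkpoint; it illustrates the general setup only, not any particular system from the survey.

# Minimal dense-retrieval sketch: embed the query and the passages with a
# pretrained encoder, then rank passages by similarity to the query.
# Assumes the sentence-transformers package is installed; any bi-encoder
# checkpoint could be substituted for all-MiniLM-L6-v2.
import numpy as np
from sentence_transformers import SentenceTransformer

encoder = SentenceTransformer("all-MiniLM-L6-v2")

passages = [
    "Neural machine translation maps source sentences to target sentences.",
    "Dense retrieval encodes text into vectors for nearest-neighbour search.",
    "Knowledge distillation transfers knowledge from a teacher to a student.",
]
query = "how does dense retrieval work?"

# Normalised embeddings make the dot product equivalent to cosine similarity.
p_vecs = encoder.encode(passages, normalize_embeddings=True)
q_vec = encoder.encode([query], normalize_embeddings=True)[0]

scores = p_vecs @ q_vec            # one similarity score per passage
for rank, idx in enumerate(np.argsort(-scores), start=1):
    print(f"{rank}. ({scores[idx]:.3f}) {passages[idx]}")
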
Knowledge distillation: A survey
In recent years, deep neural networks have been successful in both industry and academia,
especially for computer vision tasks. The great success of deep learning is mainly due to its …
Multitask prompt tuning enables parameter-efficient transfer learning
Prompt tuning, in which a base pretrained model is adapted to each task via conditioning on
learned prompt vectors, has emerged as a promising approach for efficiently adapting large …
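
The snippet above describes the core mechanism of prompt tuning: the pretrained model stays frozen, and only a small set of prompt vectors prepended to the input is learned for each task. Below is a minimal PyTorch sketch of that vanilla setup, with a tiny randomly initialised encoder standing in for a real pretrained model; the paper's multitask sharing and decomposition of prompts is not reproduced here.

# Sketch of vanilla prompt tuning: the backbone is frozen and only the
# prompt matrix (plus a small task head) receives gradients. The tiny
# Transformer below is a stand-in for a real pretrained model, used only
# to keep the example self-contained.
import torch
import torch.nn as nn

d_model, n_prompt, vocab = 64, 8, 1000

embed = nn.Embedding(vocab, d_model)
backbone = nn.TransformerEncoder(
    nn.TransformerEncoderLayer(d_model, nhead=4, batch_first=True), num_layers=2
)
for p in list(embed.parameters()) + list(backbone.parameters()):
    p.requires_grad = False                      # freeze the "pretrained" model

prompt = nn.Parameter(torch.randn(n_prompt, d_model) * 0.02)   # trainable prompt vectors
head = nn.Linear(d_model, 2)                                   # trainable task head

def forward(token_ids: torch.Tensor) -> torch.Tensor:
    x = embed(token_ids)                               # (batch, seq, d_model)
    p = prompt.unsqueeze(0).expand(x.size(0), -1, -1)  # broadcast prompts over the batch
    h = backbone(torch.cat([p, x], dim=1))             # condition the frozen model on the prompts
    return head(h[:, 0])                               # predict from the first position

opt = torch.optim.AdamW([prompt] + list(head.parameters()), lr=1e-3)
logits = forward(torch.randint(0, vocab, (4, 16)))
loss = nn.functional.cross_entropy(logits, torch.tensor([0, 1, 0, 1]))
loss.backward()                                        # gradients reach only prompt and head
opt.step()
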
Speculative decoding with big little decoder
The recent emergence of Large Language Models based on the Transformer architecture
has enabled dramatic advancements in the field of Natural Language Processing. However …
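
The title refers to a draft-and-verify decoding scheme in which a small ("little") model proposes tokens cheaply and a large ("big") model checks them. The toy sketch below shows only the generic greedy draft-then-verify loop; small_next and big_next are hypothetical stand-ins for the two models, and the paper's specific fallback and rollback policy is not reproduced.

# Generic draft-then-verify decoding loop (greedy variant). In a real system
# the big model would score all drafted positions with a single forward pass;
# the per-position call below only keeps this toy self-contained.
from typing import Callable, List

def speculative_decode(
    prefix: List[int],
    small_next: Callable[[List[int]], int],   # hypothetical: greedy next token of the small model
    big_next: Callable[[List[int]], int],     # hypothetical: greedy next token of the big model
    draft_len: int = 4,
    max_new: int = 16,
) -> List[int]:
    out = list(prefix)
    while len(out) < len(prefix) + max_new:
        # 1) The small model cheaply drafts a block of candidate tokens.
        draft: List[int] = []
        for _ in range(draft_len):
            draft.append(small_next(out + draft))
        # 2) The big model verifies position by position, keeping the longest
        #    agreeing prefix and correcting the first disagreement.
        accepted: List[int] = []
        for tok in draft:
            verified = big_next(out + accepted)
            accepted.append(verified)
            if verified != tok:
                break
        out.extend(accepted)
    return out

# Toy usage: the "small" model repeats the last token, the "big" model counts up.
print(speculative_decode([1, 2, 3],
                         small_next=lambda seq: seq[-1],
                         big_next=lambda seq: seq[-1] + 1,
                         max_new=6))
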
Aligning distillation for cold-start item recommendation
Recommending cold items in recommendation systems is a long-standing challenge due to
the inherent differences between warm items, which are recommended based on user …
A survey on non-autoregressive generation for neural machine translation and beyond
Non-autoregressive (NAR) generation, which was first proposed in neural machine translation (NMT) to speed up inference, has attracted much attention in both machine learning and …
Redistributing low-frequency words: Making the most of monolingual data in non-autoregressive translation
Knowledge distillation (KD) is the preliminary step for training non-autoregressive translation (NAT) models, which eases their training at the cost of losing …
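
The KD step mentioned here is normally sequence-level distillation: an autoregressive teacher re-translates the source side of the parallel corpus, and the NAT student is trained on the teacher's outputs rather than the original references. A minimal sketch of building such a distilled corpus follows; teacher_translate is a hypothetical stand-in for beam-search decoding with a trained teacher, and the paper's remedy for the low-frequency words lost in this step is not shown.

# Sequence-level knowledge distillation as a preprocessing step for NAT:
# replace each reference translation with the teacher's own output.
# teacher_translate is a hypothetical callable; in practice it would run
# beam search with a trained autoregressive NMT model.
from typing import Callable, List, Sequence, Tuple

def build_distilled_corpus(
    parallel_data: Sequence[Tuple[str, str]],
    teacher_translate: Callable[[List[str]], List[str]],
    batch_size: int = 32,
) -> List[Tuple[str, str]]:
    sources = [src for src, _ in parallel_data]
    distilled: List[Tuple[str, str]] = []
    for i in range(0, len(sources), batch_size):
        batch = sources[i:i + batch_size]
        # The teacher's translations are more deterministic than the human
        # references, which makes them easier for a NAT student to model.
        distilled.extend(zip(batch, teacher_translate(batch)))
    return distilled

# Toy usage with an identity "teacher"; a real teacher would be an AT NMT model.
corpus = [("ein Haus", "a house"), ("ein Baum", "a tree")]
print(build_distilled_corpus(corpus, teacher_translate=lambda xs: list(xs)))
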
Deep encoder, shallow decoder: Reevaluating non-autoregressive machine translation
Much recent effort has been invested in non-autoregressive neural machine translation,
which appears to be an efficient alternative to state-of-the-art autoregressive machine …