Transformer: A general framework from machine translation to others

Y Zhao, J Zhang, C Zong - Machine Intelligence Research, 2023 - Springer
Machine translation is an important and challenging task that aims at automatically
translating natural language sentences from one language into another. Recently …

Neural machine translation: Challenges, progress and future

J Zhang, C Zong - Science China Technological Sciences, 2020 - Springer
Machine translation (MT) is a technique that leverages computers to translate
human languages automatically. Nowadays, neural machine translation (NMT) which …

Modeling localness for self-attention networks

B Yang, Z Tu, DF Wong, F Meng, LS Chao… - arXiv preprint arXiv …, 2018 - arxiv.org
Self-attention networks have proven to be of profound value for their strength in capturing
global dependencies. In this work, we propose to model localness for self-attention …
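
The localness modeling this entry refers to can be pictured as biasing the attention logits toward a local window around each position. Below is a minimal NumPy sketch under that assumption; the fixed Gaussian width `sigma`, the single head, and all shapes are illustrative choices rather than the paper's exact parameterization (which learns the window per query).

```python
# Minimal sketch: self-attention with a Gaussian locality bias added to the
# attention logits, so each position favours a local window around itself.
# The fixed window width `sigma` and all shapes are illustrative assumptions.
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def local_self_attention(x, Wq, Wk, Wv, sigma=2.0):
    """x: (seq_len, d_model); Wq/Wk/Wv: (d_model, d_k)."""
    q, k, v = x @ Wq, x @ Wk, x @ Wv
    d_k = q.shape[-1]
    logits = q @ k.T / np.sqrt(d_k)           # (seq_len, seq_len) content scores
    pos = np.arange(x.shape[0])
    dist = pos[:, None] - pos[None, :]        # signed distance between positions
    bias = -(dist ** 2) / (2.0 * sigma ** 2)  # Gaussian penalty on distant tokens
    return softmax(logits + bias) @ v

rng = np.random.default_rng(0)
x = rng.normal(size=(6, 16))
W = [rng.normal(size=(16, 8)) * 0.1 for _ in range(3)]
print(local_self_attention(x, *W).shape)      # (6, 8)
```

With the bias term removed, the function reduces to standard scaled dot-product self-attention.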

Learning to remember translation history with a continuous cache

Z Tu, Y Liu, S Shi, T Zhang - Transactions of the Association for …, 2018 - direct.mit.edu
Existing neural machine translation (NMT) models generally translate sentences in isolation,
missing the opportunity to take advantage of document-level information. In this work, we …
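
A continuous cache of this kind can be pictured as a growing store of key/value vectors from previously translated sentences that the decoder queries with soft attention. The sketch below is a simplified rendering under that assumption; the `ContinuousCache` class, the scalar sigmoid gate, and the dimensions are illustrative, not the exact formulation in the cited paper.

```python
# Minimal sketch of a continuous cache for translation history: hidden states
# from earlier sentences are stored as key/value pairs, and the current
# decoder state queries them with soft attention before being gated back in.
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

class ContinuousCache:
    def __init__(self):
        self.keys, self.values = [], []

    def write(self, key, value):
        self.keys.append(key)
        self.values.append(value)

    def read(self, query):
        if not self.keys:
            return np.zeros_like(query)
        K = np.stack(self.keys)                 # (cache_size, d)
        V = np.stack(self.values)               # (cache_size, d)
        weights = softmax(K @ query)            # match query against cached keys
        return weights @ V                      # history-aware summary vector

rng = np.random.default_rng(1)
cache = ContinuousCache()
for _ in range(5):                              # states from earlier sentences
    cache.write(rng.normal(size=8), rng.normal(size=8))
h_t = rng.normal(size=8)                        # current decoder state
m_t = cache.read(h_t)
g = 1.0 / (1.0 + np.exp(-(h_t @ m_t)))          # scalar gate (an assumption here)
h_new = g * m_t + (1.0 - g) * h_t               # blend history into the state
print(h_new.shape)
```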

Context-aware self-attention networks for natural language processing

B Yang, L Wang, DF Wong, S Shi, Z Tu - Neurocomputing, 2021 - Elsevier
Recently, Self-Attention Networks (SANs) have shown their flexibility in parallel
computation and their effectiveness in modeling both short- and long-term dependencies …
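
One simple way to make self-attention context-aware, in the spirit of this entry, is to fold a global summary of the sentence into the queries and keys before the attention scores are computed. The sketch below assumes a mean-pooled context vector and a fixed mixing weight `lam`; the cited work learns how much context to inject, so treat this purely as an illustration.

```python
# Minimal sketch of context-aware self-attention: a global context vector
# (here just the mean of all token representations) is mixed into the queries
# and keys before scoring. The mixing weight and shapes are assumptions.
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def context_aware_attention(x, Wq, Wk, Wv, lam=0.5):
    """x: (seq_len, d_model); Wq/Wk/Wv: (d_model, d_k)."""
    c = x.mean(axis=0, keepdims=True)           # global context summary
    q = (x + lam * c) @ Wq                      # context-enriched queries
    k = (x + lam * c) @ Wk                      # context-enriched keys
    v = x @ Wv
    scores = q @ k.T / np.sqrt(q.shape[-1])
    return softmax(scores) @ v

rng = np.random.default_rng(2)
x = rng.normal(size=(5, 16))
W = [rng.normal(size=(16, 8)) * 0.1 for _ in range(3)]
print(context_aware_attention(x, *W).shape)     # (5, 8)
```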

Towards neural phrase-based machine translation

PS Huang, C Wang, S Huang, D Zhou… - arXiv preprint arXiv …, 2017 - arxiv.org
In this paper, we present Neural Phrase-based Machine Translation (NPMT). Our method
explicitly models the phrase structures in output sequences using Sleep-WAke Networks …
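
The phrase modeling mentioned here rests on segmentation-based sequence modeling: the probability of an output sequence is obtained by summing, over all ways of cutting it into phrases, the product of the phrase scores. The dynamic program below illustrates that marginalization with a toy scorer; the `phrase_score` stand-in and the length cap are assumptions, whereas SWAN parameterizes phrase probabilities with neural networks and marginalizes them in the same way.

```python
# Minimal sketch of segmentation-based sequence scoring: sum over all
# segmentations of the output into phrases, multiplying phrase scores.
from functools import lru_cache

def sequence_prob(tokens, phrase_score, max_phrase_len=3):
    """Dynamic program over all segmentations of `tokens` into phrases."""
    n = len(tokens)

    @lru_cache(maxsize=None)
    def prob_from(i):
        if i == n:
            return 1.0
        total = 0.0
        for j in range(i + 1, min(n, i + max_phrase_len) + 1):
            total += phrase_score(tuple(tokens[i:j])) * prob_from(j)
        return total

    return prob_from(0)

# Toy scorer: shorter phrases are more likely (pure assumption for the demo).
toy_score = lambda phrase: 0.5 ** len(phrase)
print(sequence_prob(["we", "propose", "npmt"], toy_score))
```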

Multi-granularity self-attention for neural machine translation

J Hao, X Wang, S Shi, J Zhang, Z Tu - arXiv preprint arXiv:1909.02222, 2019 - arxiv.org
Current state-of-the-art neural machine translation (NMT) uses a deep multi-head
self-attention network with no explicit phrase information. However, prior work on statistical …
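
Multi-granularity self-attention can be thought of as letting some attention heads attend over phrase-level representations instead of individual tokens. The sketch below builds phrase keys and values by mean-pooling fixed-length n-grams, which is a simplifying assumption; the cited work also considers syntactic phrases and mixes several granularities across heads.

```python
# Minimal sketch of phrase-level attention: tokens are grouped into fixed-length
# n-grams, phrase representations are built by mean-pooling, and token queries
# attend over those phrase keys/values.
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def phrase_attention(x, Wq, Wk, Wv, n=2):
    """x: (seq_len, d_model) with seq_len divisible by n."""
    seq_len, d_model = x.shape
    phrases = x.reshape(seq_len // n, n, d_model).mean(axis=1)  # pooled phrases
    q = x @ Wq                                  # token-level queries
    k, v = phrases @ Wk, phrases @ Wv           # phrase-level keys/values
    scores = q @ k.T / np.sqrt(q.shape[-1])     # (seq_len, n_phrases)
    return softmax(scores) @ v

rng = np.random.default_rng(3)
x = rng.normal(size=(6, 16))
W = [rng.normal(size=(16, 8)) * 0.1 for _ in range(3)]
print(phrase_attention(x, *W).shape)            # (6, 8)
```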

Phrase2Vec: phrase embedding based on parsing

Y Wu, S Zhao, W Li - Information Sciences, 2020 - Elsevier
Text is one of the most common forms of unstructured data, and usually the primary task in
text mining is to transform the text into a structured representation. However, the existing text …

Incorporating statistical machine translation word knowledge into neural machine translation

X Wang, Z Tu, M Zhang - IEEE/ACM Transactions on Audio …, 2018 - ieeexplore.ieee.org
Neural machine translation (NMT) has gained more and more attention in recent years,
mainly due to its simplicity yet state-of-the-art performance. However, previous research has …

Google Translate vs. DeepL: analysing neural machine translation performance under the challenge of phraseological variation

CM Hidalgo-Ternero - 2020 - rua.ua.es
This study aims to analyse the performance of two neural machine translation (NMT)
systems, Google Translate and DeepL, in the translation (ES> …