Transformer: A general framework from machine translation to others
Abstract: Machine translation is an important and challenging task that aims at automatically
translating natural language sentences from one language into another. Recently …
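As a rough orientation for readers of this entry, the sketch below shows the standard scaled dot-product attention that Transformer-based translation systems build on. It is an illustrative NumPy example, not code from the paper; the shapes and toy inputs are assumptions.

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Q, K: (seq_len, d_k); V: (seq_len, d_v)."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                    # pairwise similarities
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)     # row-wise softmax
    return weights @ V                                 # weighted sum of values

x = np.random.randn(4, 8)                              # 4 tokens, dimension 8
out = scaled_dot_product_attention(x, x, x)            # self-attention case
print(out.shape)                                       # (4, 8)
```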
Neural machine translation: Challenges, progress and future
Abstract: Machine translation (MT) is a technique that leverages computers to translate
human languages automatically. Nowadays, neural machine translation (NMT) which …
Modeling localness for self-attention networks
Self-attention networks have proven to be of profound value for their strength in capturing
global dependencies. In this work, we propose to model localness for self-attention …
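This paper models localness with a learnable Gaussian bias on the attention logits. The sketch below illustrates the general idea with a fixed Gaussian window; the fixed sigma and the single-head NumPy formulation are simplifying assumptions, not the paper's exact parameterisation.

```python
import numpy as np

def local_self_attention(x, sigma=2.0):
    """x: (seq_len, d). A fixed-sigma Gaussian locality bias (assumed here)."""
    n, d = x.shape
    scores = x @ x.T / np.sqrt(d)                      # content-based logits
    pos = np.arange(n)
    dist2 = (pos[:, None] - pos[None, :]) ** 2
    scores -= dist2 / (2.0 * sigma ** 2)               # penalise distant positions
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)     # row-wise softmax
    return weights @ x

out = local_self_attention(np.random.randn(6, 8))
print(out.shape)                                       # (6, 8)
```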
Learning to remember translation history with a continuous cache
Existing neural machine translation (NMT) models generally translate sentences in isolation,
missing the opportunity to take advantage of document-level information. In this work, we …
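The sketch below illustrates the general idea of a continuous cache: store key-value vectors from previously translated sentences and retrieve them by softmax-weighted key similarity. The class name ContinuousCache and the question of how the retrieved vector is combined with the decoder state are assumptions for illustration, not the paper's exact design.

```python
import numpy as np

class ContinuousCache:
    """Holds (key, value) vectors from earlier sentences; name is illustrative."""
    def __init__(self):
        self.keys, self.values = [], []

    def write(self, key, value):
        self.keys.append(key)
        self.values.append(value)

    def read(self, query):
        if not self.keys:
            return np.zeros_like(query)
        K = np.stack(self.keys)                        # (cache_size, d)
        V = np.stack(self.values)
        scores = K @ query / np.sqrt(query.shape[0])   # similarity to cached keys
        w = np.exp(scores - scores.max())
        w /= w.sum()                                   # softmax over cache slots
        return w @ V                                   # weighted cache summary

cache = ContinuousCache()
cache.write(np.random.randn(8), np.random.randn(8))
print(cache.read(np.random.randn(8)).shape)            # (8,)
```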
Context-aware self-attention networks for natural language processing
Abstract: Recently, Self-Attention Networks (SANs) have shown their flexibility in parallel
computation and effectiveness in modeling both short- and long-term dependencies …
Towards neural phrase-based machine translation
In this paper, we present Neural Phrase-based Machine Translation (NPMT). Our method
explicitly models the phrase structures in output sequences using Sleep-WAke Networks …
Multi-granularity self-attention for neural machine translation
Current state-of-the-art neural machine translation (NMT) uses a deep multi-head
self-attention network with no explicit phrase information. However, prior work on statistical …
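The sketch below shows one simple way to expose phrase-level granularity to self-attention: pool consecutive n-grams into phrase vectors and let each token attend over tokens and phrases jointly. The mean pooling and fixed n-gram boundaries are assumptions, not the paper's construction.

```python
import numpy as np

def softmax(s):
    e = np.exp(s - s.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def multi_granularity_attention(x, n=2):
    """x: (seq_len, d); n-gram mean pooling is an assumed phrase scheme."""
    d = x.shape[1]
    phrases = np.stack([x[i:i + n].mean(axis=0)          # one vector per n-gram
                        for i in range(len(x) - n + 1)])
    memory = np.concatenate([x, phrases], axis=0)        # tokens + phrases
    weights = softmax(x @ memory.T / np.sqrt(d))         # tokens attend over both
    return weights @ memory

out = multi_granularity_attention(np.random.randn(5, 8))
print(out.shape)                                         # (5, 8)
```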
Phrase2Vec: phrase embedding based on parsing
Y Wu, S Zhao, W Li - Information Sciences, 2020 - Elsevier
Text is one of the most common forms of unstructured data, and usually the primary task in text
mining is to transform the text into a structured representation. However, the existing text …
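The sketch below shows a simple averaging baseline for turning a phrase into a fixed-size vector. Phrase2Vec's parsing-based model is more elaborate; the toy vocabulary and random 8-dimensional vectors here are assumptions for illustration only.

```python
import numpy as np

# Toy vocabulary with random vectors; a real system would use trained embeddings.
word_vectors = {w: np.random.randn(8)
                for w in ["neural", "machine", "translation", "model"]}

def phrase_embedding(phrase):
    """Average the word vectors of a phrase (a simple baseline, not Phrase2Vec)."""
    vecs = [word_vectors[w] for w in phrase.split() if w in word_vectors]
    return np.mean(vecs, axis=0) if vecs else np.zeros(8)

print(phrase_embedding("neural machine translation").shape)  # (8,)
```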
Incorporating statistical machine translation word knowledge into neural machine translation
Neural machine translation (NMT) has gained more and more attention in recent years,
mainly due to its simplicity yet state-of-the-art performance. However, previous research has …
Google Translate vs. DeepL: analysing neural machine translation performance under the challenge of phraseological variation
CM Hidalgo-Ternero - 2020 - rua.ua.es
This study aims to analyse the performance of two neural machine translation (NMT) systems,
Google Translate and DeepL, in the translation (ES> …