Neural machine translation: A review of methods, resources, and tools
Machine translation (MT) is an important sub-field of natural language processing
that aims to translate natural languages using computers. In recent years, end-to-end neural …
Are sixteen heads really better than one?
Multi-headed attention is a driving force behind recent state-of-the-art NLP models. By
applying multiple attention mechanisms in parallel, it can express sophisticated functions …
Mask-predict: Parallel decoding of conditional masked language models
Most machine translation systems generate text autoregressively from left to right. We,
instead, use a masked language modeling objective to train a model to predict any subset of …
Non-autoregressive machine translation with latent alignments
This paper presents two strong methods, CTC and Imputer, for non-autoregressive machine
translation that model latent alignments with dynamic programming. We revisit CTC for …
Beyond BLEU: training neural machine translation with semantic similarity
While most neural machine translation (NMT) systems are still trained using maximum
likelihood estimation, recent work has demonstrated that optimizing systems to directly …
Unsupervised multimodal machine translation for low-resource distant language pairs
Unsupervised machine translation (UMT) has recently attracted more attention from
researchers, enabling models to translate when languages lack parallel corpora. However …
Very deep transformers for neural machine translation
We explore the application of very deep Transformer models for Neural Machine Translation
(NMT). Using a simple yet effective initialization technique that stabilizes training, we show …
Aligned cross entropy for non-autoregressive machine translation
Non-autoregressive machine translation models significantly speed up decoding by
allowing for parallel prediction of the entire target sequence. However, modeling word order …
Fixed encoder self-attention patterns in transformer-based machine translation
Transformer-based models have brought a radical change to neural machine translation. A
key feature of the Transformer architecture is the so-called multi-head attention mechanism …
DiNoiSer: Diffused conditional sequence learning by manipulating noises
While diffusion models have achieved great success in generating continuous signals such
as images and audio, it remains elusive for diffusion models to learn discrete sequence …