Modeling localness for self-attention networks
Self-attention networks have proven to be of profound value for their strength in capturing
global dependencies. In this work, we propose to model localness for self-attention …
Convolutional self-attention networks
Self-attention networks (SANs) have drawn increasing interest due to their high
parallelization in computation and flexibility in modeling dependencies. SANs can be further …
Context-aware self-attention networks
The self-attention model has shown its flexibility in parallel computation and its effectiveness in
modeling both long- and short-term dependencies. However, it calculates the dependencies …
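The self-attention mechanism these entries build on can be sketched as scaled dot-product attention. The following is a minimal generic illustration, not the specific variant of any paper listed here; the function and variable names are assumptions for the example:

```python
import numpy as np

def self_attention(X, Wq, Wk, Wv):
    """Scaled dot-product self-attention over a token sequence X (n_tokens x d_model)."""
    # Project every token into query, key, and value vectors.
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    # Pairwise token affinities, scaled by sqrt of the key dimension.
    scores = Q @ K.T / np.sqrt(K.shape[-1])
    # Row-wise softmax turns affinities into attention weights over all positions,
    # which is what lets each token attend to global context in one step.
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    # Each output token is a weighted mixture of all value vectors.
    return weights @ V

rng = np.random.default_rng(0)
n, d = 4, 8                                   # toy sizes: 4 tokens, model width 8
X = rng.standard_normal((n, d))
Wq, Wk, Wv = (rng.standard_normal((d, d)) for _ in range(3))
out = self_attention(X, Wq, Wk, Wv)
print(out.shape)  # (4, 8)
```

Because every token attends to every other token in a single matrix product, the computation parallelizes well; the localness- and context-aware variants above modify the `scores` term to bias this otherwise position-agnostic weighting.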
Integrating visuospatial, linguistic and commonsense structure into story visualization
While much research has been done in text-to-image synthesis, little work has been done to
explore the usage of linguistic structure of the input text. Such information is even more …
Incorporating rich syntax information in Grammatical Error Correction
Syntax parse trees are a method of representing sentence structure and are often
used to provide models with syntax information and enhance downstream task performance …
Assessing the ability of self-attention networks to learn word order
Self-attention networks (SANs) have attracted a lot of interest due to their high parallelization
and strong performance on a variety of NLP tasks, e.g. machine translation. Due to the lack of …
Neural machine translation with source-side latent graph parsing
This paper presents a novel neural machine translation model which jointly learns
translation and source-side latent graph representations of sentences. Unlike existing …
Neural machine translation: A review and survey
F Stahlberg - arXiv preprint arXiv:1912.02047, 2019 - arxiv.org
The field of machine translation (MT), the automatic translation of written text from one
natural language into another, has experienced a major paradigm shift in recent years …
Improving neural machine translation with latent features feedback
Y Li, J Li, M Zhang - Neurocomputing, 2021 - Elsevier
Most state-of-the-art neural machine translation (NMT) models progressively encode feature
representation in a bottom-up feed-forward fashion. This traditional encoding mechanism …
[HTML] Multi-source neural model for machine translation of agglutinative language
Y Pan, X Li, Y Yang, R Dong - Future Internet, 2020 - mdpi.com
Benefiting from the rapid development of artificial intelligence (AI) and deep learning, the
machine translation task based on neural networks has achieved impressive performance in …