Linguistic knowledge and transferability of contextual representations
Contextual word representations derived from large-scale neural language models are
successful across a diverse set of NLP tasks, suggesting that they encode useful and …
On the linguistic representational power of neural machine translation models
Despite the recent success of deep neural networks in natural language processing and
other spheres of artificial intelligence, their interpretability remains a challenge. We analyze …
LogiQA 2.0: an improved dataset for logical reasoning in natural language understanding
NLP research on logical reasoning regains momentum with the recent releases of a handful
of datasets, notably LogiQA and Reclor. Logical reasoning is exploited in many probing …
An analysis of encoder representations in transformer-based machine translation
The attention mechanism is a successful technique in modern NLP, especially in tasks like
machine translation. The recently proposed network architecture of the Transformer is based …
Exploring and predicting transferability across NLP tasks
Recent advances in NLP demonstrate the effectiveness of training large-scale language
models and transferring them to downstream tasks. Can fine-tuning these models on tasks …
A survey on narrative extraction from textual data
Narratives are present in many forms of human expression and can be understood as a
fundamental way of communication between people. Computational understanding of the …
Compositionality in computational linguistics
Neural models greatly outperform grammar-based models across many tasks in modern
computational linguistics. This raises the question of whether linguistic principles, such as …
Analyzing individual neurons in pre-trained language models
While a lot of analysis has been carried out to demonstrate the linguistic knowledge captured by the
representations learned within deep NLP models, very little attention has been paid towards …
Designing a uniform meaning representation for natural language processing
In this paper we present Uniform Meaning Representation (UMR), a meaning representation
designed to annotate the semantic content of a text. UMR is primarily based on Abstract …
Discovering latent concepts learned in BERT
A large number of studies that analyze deep neural network models and their ability to
encode various linguistic and non-linguistic concepts provide an interpretation of the inner …