Interpreting deep learning models in natural language processing: A review
Neural network models have achieved state-of-the-art performances in a wide range of
natural language processing (NLP) tasks. However, a long-standing criticism against neural …
Grammar prompting for domain-specific language generation with large language models
Large language models (LLMs) can learn to perform a wide range of natural language tasks
from just a handful of in-context examples. However, for generating strings from highly …
Compositionality decomposed: How do neural networks generalise?
Despite a multitude of empirical studies, little consensus exists on whether neural networks
are able to generalise compositionally, a controversy that, in part, stems from a lack of …
Discrete opinion tree induction for aspect-based sentiment analysis
Dependency trees have been intensively used with graph neural networks for aspect-based
sentiment classification. Though being effective, such methods rely on external dependency …
Tree transformer: Integrating tree structures into self-attention
Pre-training Transformer from large-scale raw texts and fine-tuning on the desired task have
achieved state-of-the-art results on diverse NLP tasks. However, it is unclear what the …
Compound probabilistic context-free grammars for grammar induction
We study a formalization of the grammar induction problem that models sentences as being
generated by a compound probabilistic context-free grammar. In contrast to traditional …
Uncertainty in natural language generation: From theory to applications
Recent advances of powerful Language Models have allowed Natural Language
Generation (NLG) to emerge as an important technology that can not only perform traditional …
Zero-shot 3d drug design by sketching and generating
Drug design is a crucial step in the drug discovery cycle. Recently, various deep learning-
based methods design drugs by generating novel molecules from scratch, avoiding …
Are pre-trained language models aware of phrases? Simple but strong baselines for grammar induction
With the recent success and popularity of pre-trained language models (LMs) in natural
language processing, there has been a rise in efforts to understand their inner workings. In …
Assessing phrasal representation and composition in transformers
Deep transformer models have pushed performance on NLP tasks to new limits, suggesting
sophisticated treatment of complex linguistic inputs, such as phrases. However, we have …