Evaluation of text generation: A survey
The paper surveys evaluation methods of natural language generation (NLG) systems that
have been developed in the last few years. We group NLG evaluation methods into three …
Optimus: Organizing sentences via pre-trained modeling of a latent space
When trained effectively, the Variational Autoencoder (VAE) can be both a powerful
generative model and an effective representation learning framework for natural language …
Fast structured decoding for sequence models
Autoregressive sequence models achieve state-of-the-art performance in domains like
machine translation. However, due to the autoregressive factorization nature, these models …
Music fadernets: Controllable music generation based on high-level features via low-level feature modelling
High-level musical qualities (such as emotion) are often abstract, subjective, and hard to
quantify. Given these difficulties, it is not easy to learn good feature representations with …
Planner: Generating diversified paragraph via latent language diffusion model
Autoregressive models for text sometimes generate repetitive and low-quality output
because errors accumulate during the steps of generation. This issue is often attributed to …
Paraphrase generation with latent bag of words
Paraphrase generation is a longstanding and important problem in natural language processing.
Recent progress in deep generative models has shown promising results on discrete latent …
Learning variational word masks to improve the interpretability of neural text classifiers
To build an interpretable neural text classifier, most of the prior work has focused on
designing inherently interpretable models or finding faithful explanations. A new line of work …
To be closer: Learning to link up aspects with opinions
Dependency parse trees are helpful for discovering the opinion words in aspect-based
sentiment analysis (ABSA). However, the trees obtained from off-the-shelf dependency …
ARNOR: Attention regularization based noise reduction for distant supervision relation classification
Distant supervision is widely used in relation classification in order to create large-scale
training data by aligning a knowledge base with an unlabeled corpus. However, it also …
Neural data-to-text generation via jointly learning the segmentation and correspondence
The neural attention model has achieved great success in data-to-text generation tasks.
Though usually excelling at producing fluent text, it suffers from the problem of information …