A survey of controllable text generation using transformer-based pre-trained language models
Controllable Text Generation (CTG) is an emerging area in the field of natural language
generation (NLG). It is regarded as crucial for the development of advanced text generation …
Pre-trained language models for text generation: A survey
Text Generation aims to produce plausible and readable text in human language from input
data. The resurgence of deep learning has greatly advanced this field, in particular, with the …
LaMDA: Language models for dialog applications
We present LaMDA: Language Models for Dialog Applications. LaMDA is a family of
Transformer-based neural language models specialized for dialog, which have up to 137B …
A survey of natural language generation
This article offers a comprehensive review of the research on Natural Language Generation
(NLG) over the past two decades, especially in relation to data-to-text generation and text-to …
Recipes for building an open-domain chatbot
Building open-domain chatbots is a challenging area for machine learning research. While
prior work has shown that scaling neural models in the number of parameters and the size of …
Transformers: State-of-the-art natural language processing
Recent progress in natural language processing has been driven by advances in both
model architecture and model pretraining. Transformer architectures have facilitated …
DialoGPT: Large-scale generative pre-training for conversational response generation
We present a large, tunable neural conversational response generation model, DialoGPT
(dialogue generative pre-trained transformer). Trained on 147M conversation-like …
Language models are unsupervised multitask learners
Natural language processing tasks, such as question answering, machine translation,
reading comprehension, and summarization, are typically approached with supervised …
Multi-task pre-training for plug-and-play task-oriented dialogue system
Pre-trained language models have been recently shown to benefit task-oriented dialogue
(TOD) systems. Despite their success, existing methods often formulate this task as a …
TOD-BERT: Pre-trained natural language understanding for task-oriented dialogue
The underlying difference of linguistic patterns between general text and task-oriented
dialogue makes existing pre-trained language models less useful in practice. In this work …