A survey of controllable text generation using transformer-based pre-trained language models
Controllable Text Generation (CTG) is an emerging area in the field of natural language
generation (NLG). It is regarded as crucial for the development of advanced text generation …
A survey on non-autoregressive generation for neural machine translation and beyond
Y ** for efficient BERT pretraining
Non-autoregressive machine translation with disentangled context transformer
State-of-the-art neural machine translation models generate a translation from left to right
and every step is conditioned on the previously generated tokens. The sequential nature of …
Improving non-autoregressive translation models without distillation
Transformer-based autoregressive (AR) machine translation models have achieved
significant performance improvements, nearing human-level accuracy on some languages …
Guiding non-autoregressive neural machine translation decoding with reordering information
Non-autoregressive neural machine translation (NAT) generates each target word in parallel
and has achieved promising inference acceleration. However, existing NAT models still …
Transformers go for the LOLs: Generating (humourous) titles from scientific abstracts end-to-end
We consider the end-to-end abstract-to-title generation problem, exploring seven recent
transformer based models (including ChatGPT) fine-tuned on more than 30k abstract-title …