Self-supervised speech representation learning: A review
Although supervised deep learning has revolutionized speech and audio processing, it has
necessitated the building of specialist models for individual tasks and application scenarios …
Neural machine translation for low-resource languages: A survey
S Ranathunga, ESA Lee, M Prifti Skenduli… - ACM Computing …, 2023 - dl.acm.org
Neural Machine Translation (NMT) has seen tremendous growth in the last ten years since
the early 2000s and has already entered a mature phase. While considered the most widely …
True few-shot learning with language models
Pretrained language models (LMs) perform well on many tasks even when learning from a
few examples, but prior work uses many held-out examples to tune various aspects of …
Unsupervised speech recognition
Despite rapid progress in the recent past, current speech recognition systems still require
labeled training data which limits this technology to a small fraction of the languages spoken …
Contrastive learning for sequential recommendation
Sequential recommendation methods play a crucial role in modern recommender systems
because of their ability to capture a user's dynamic interest from her/his historical interactions …
Multilingual denoising pre-training for neural machine translation
This paper demonstrates that multilingual denoising pre-training produces significant
performance gains across a wide variety of machine translation (MT) tasks. We present …
Cross-lingual language model pretraining
Recent studies have demonstrated the efficiency of generative pretraining for English
natural language understanding. In this work, we extend this approach to multiple …
CTRL: A conditional transformer language model for controllable generation
Large-scale language models show promising text generation capabilities, but users cannot
easily control particular aspects of the generated text. We release CTRL, a 1.63 billion …
MASS: Masked sequence to sequence pre-training for language generation
Pre-training and fine-tuning, e.g., BERT, have achieved great success in language
understanding by transferring knowledge from rich-resource pre-training task to the low/zero …
Language models are unsupervised multitask learners
Natural language processing tasks, such as question answering, machine translation,
reading comprehension, and summarization, are typically approached with supervised …