Deep learning for text style transfer: A survey
Text style transfer is an important task in natural language generation, which aims to control
certain attributes in the generated text, such as politeness, emotion, humor, and many …
EmpDG: Multi-resolution interactive empathetic dialogue generation
A humanized dialogue system is expected to generate empathetic replies, which should be
sensitive to the users' expressed emotion. The task of empathetic dialogue generation is …
Continual learning for text classification with information disentanglement based regularization
Continual learning has become increasingly important as it enables NLP models to
constantly learn and gain knowledge over time. Previous continual learning methods are …
Exploring controllable text generation techniques
Neural controllable text generation is an important area gaining attention due to its plethora
of applications. Although there is a large body of prior work in controllable text generation …
DP-VAE: Human-readable text anonymization for online reviews with differentially private variational autoencoders
B Weggenmann, V Rublack, M Andrejczuk… - Proceedings of the …, 2022 - dl.acm.org
While vast amounts of personal data are shared daily on public online platforms and used
by companies and analysts to gain valuable insights, privacy concerns are also on the rise …
Generating syntactically controlled paraphrases without using annotated parallel pairs
Paraphrase generation plays an essential role in natural language processing (NLP), and it has
many downstream applications. However, training supervised paraphrase models requires …
Lottery ticket adaptation: Mitigating destructive interference in LLMs
Existing methods for adapting large language models (LLMs) to new tasks are not suited to
multi-task adaptation because they modify all the model weights, causing destructive …
Scalable language model with generalized continual learning
Continual learning has gained increasing importance as it facilitates the acquisition and
refinement of scalable knowledge and skills in language models. However, existing …
Task-agnostic low-rank adapters for unseen English dialects
Large Language Models (LLMs) are trained on corpora disproportionally weighted in favor
of Standard American English. As a result, speakers of other dialects experience …
On learning and representing social meaning in NLP: a sociolinguistic perspective
The field of NLP has made substantial progress in building meaning representations.
However, an important aspect of linguistic meaning, social meaning, has been largely …