KILM: Knowledge injection into encoder-decoder language models
Large pre-trained language models (PLMs) have been shown to retain implicit knowledge
within their parameters. To enhance this implicit knowledge, we propose Knowledge …
FRUIT: Faithfully reflecting updated information in text
Textual knowledge bases such as Wikipedia require considerable effort to keep up to date
and consistent. While automated writing assistants could potentially ease this burden, the …
Elaborative simplification: Content addition and explanation generation in text simplification
Much of modern-day text simplification research focuses on sentence-level simplification,
transforming original, more complex sentences into simplified versions. However, adding …
WikiTableT: A large-scale data-to-text dataset for generating Wikipedia article sections
Datasets for data-to-text generation typically focus either on multi-domain, single-sentence
generation or on single-domain, long-form generation. In this work, we cast generating …
Creating custom event data without dictionaries: A bag-of-tricks
Event data, or structured records of "who did what to whom" that are automatically extracted
from text, is an important source of data for scholars of international politics. The high cost of …
Factual or contextual? disentangling error types in entity description generation
In the task of entity description generation, given a context and a specified entity, a model
must describe that entity correctly and in a contextually-relevant way. In this task, as well as …
IGA: An intent-guided authoring assistant
While large-scale pretrained language models have significantly improved writing
assistance functionalities such as autocomplete, more complex and controllable writing …
Characterizing collective attention via descriptor context: A case study of public discussions of crisis events
Social media datasets make it possible to rapidly quantify collective attention to emerging
topics and breaking news, such as crisis events. Collective attention is typically measured by …
[BOOK][B] Incorporating and Eliciting Knowledge in Neural Language Models
RL Logan - 2022 - search.proquest.com
Neural language models have drastically changed the landscape of natural language
processing (NLP). Originally used for language generation (e.g., in summarization and …
Leveraging natural supervision for language representation learning and generation
M Chen - arXiv preprint arXiv:2207.10617, 2022 - arxiv.org
Recent breakthroughs in Natural Language Processing (NLP) have been driven by
language models trained on a massive amount of plain text. While powerful, deriving …