KILM: Knowledge injection into encoder-decoder language models

Y Xu, M Namazifar, D Hazarika, A Padmakumar… - arXiv preprint arXiv …, 2023 - arxiv.org
Large pre-trained language models (PLMs) have been shown to retain implicit knowledge
within their parameters. To enhance this implicit knowledge, we propose Knowledge …

FRUIT: Faithfully reflecting updated information in text

RL Logan IV, A Passos, S Singh, MW Chang - arXiv preprint arXiv …, 2021 - arxiv.org
Textual knowledge bases such as Wikipedia require considerable effort to keep up to date
and consistent. While automated writing assistants could potentially ease this burden, the …

Elaborative simplification: Content addition and explanation generation in text simplification

N Srikanth, JJ Li - arXiv preprint arXiv:2010.10035, 2020 - arxiv.org
Much of modern-day text simplification research focuses on sentence-level simplification,
transforming original, more complex sentences into simplified versions. However, adding …

WikiTableT: A large-scale data-to-text dataset for generating Wikipedia article sections

M Chen, S Wiseman, K Gimpel - arXiv preprint arXiv:2012.14919, 2020 - arxiv.org
Datasets for data-to-text generation typically focus either on multi-domain, single-sentence
generation or on single-domain, long-form generation. In this work, we cast generating …

Creating custom event data without dictionaries: A bag-of-tricks

A Halterman, PA Schrodt, A Beger, BE Bagozzi… - arXiv preprint arXiv …, 2023 - arxiv.org
Event data, or structured records of "who did what to whom" that are automatically extracted
from text, is an important source of data for scholars of international politics. The high cost of …

Factual or contextual? Disentangling error types in entity description generation

N Goyal, A Nenkova, H Daumé III - … of the 61st Annual Meeting of …, 2023 - aclanthology.org
In the task of entity description generation, given a context and a specified entity, a model
must describe that entity correctly and in a contextually-relevant way. In this task, as well as …

IGA: An intent-guided authoring assistant

S Sun, W Zhao, V Manjunatha, R Jain… - arXiv preprint arXiv …, 2021 - arxiv.org
While large-scale pretrained language models have significantly improved writing
assistance functionalities such as autocomplete, more complex and controllable writing …

Characterizing collective attention via descriptor context: A case study of public discussions of crisis events

I Stewart, D Yang, J Eisenstein - … of the International AAAI Conference on …, 2020 - ojs.aaai.org
Social media datasets make it possible to rapidly quantify collective attention to emerging
topics and breaking news, such as crisis events. Collective attention is typically measured by …

[BOOK][B] Incorporating and Eliciting Knowledge in Neural Language Models

RL Logan - 2022 - search.proquest.com
Neural language models have drastically changed the landscape of natural language
processing (NLP). Originally used for language generation (e.g., in summarization and …

Leveraging natural supervision for language representation learning and generation

M Chen - arXiv preprint arXiv:2207.10617, 2022 - arxiv.org
Recent breakthroughs in Natural Language Processing (NLP) have been driven by
language models trained on a massive amount of plain text. While powerful, deriving …