A survey of the usages of deep learning for natural language processing

DW Otter, JR Medina, JK Kalita - IEEE transactions on neural …, 2020 - ieeexplore.ieee.org
Over the last several years, the field of natural language processing has been propelled
forward by an explosion in the use of deep learning models. This article provides a brief …

A study of generative large language model for medical research and healthcare

C Peng, X Yang, A Chen, KE Smith… - NPJ digital …, 2023 - nature.com
There is enormous enthusiasm, as well as concern, about applying large language models (LLMs) to
healthcare. Yet current assumptions are based on general-purpose LLMs such as ChatGPT …

Creativity and machine learning: A survey

G Franceschelli, M Musolesi - ACM Computing Surveys, 2024 - dl.acm.org
There is a growing interest in the area of machine learning and creativity. This survey
presents an overview of the history and the state of the art of computational creativity …

A knowledge-enhanced pretraining model for commonsense story generation

J Guan, F Huang, Z Zhao, X Zhu… - Transactions of the …, 2020 - direct.mit.edu
Story generation, namely, generating a reasonable story from a leading context, is an
important but challenging task. In spite of the success in modeling fluency and local …

Enabling language models to fill in the blanks

C Donahue, M Lee, P Liang - arXiv preprint arXiv:2005.05339, 2020 - arxiv.org
We present a simple approach for text infilling, the task of predicting missing spans of text at
any position in a document. While infilling could enable rich functionality especially for …

The perils of using Mechanical Turk to evaluate open-ended text generation

M Karpinska, N Akoury, M Iyyer - arXiv preprint arXiv:2109.06835, 2021 - arxiv.org
Recent text generation research has increasingly focused on open-ended domains such as
story and poetry generation. Because models built for such tasks are difficult to evaluate …

Strategies for structuring story generation

A Fan, M Lewis, Y Dauphin - arXiv preprint arXiv:1902.01109, 2019 - arxiv.org
Writers generally rely on plans or sketches to write long stories, but most current language
models generate word by word from left to right. We explore coarse-to-fine models for …

Story ending generation with incremental encoding and commonsense knowledge

J Guan, Y Wang, M Huang - Proceedings of the AAAI Conference on …, 2019 - aaai.org
Generating a reasonable ending for a given story context, i.e., story ending generation, is a
strong indication of story comprehension. This task requires not only understanding the …

Data-to-text generation with entity modeling

R Puduppully, L Dong, M Lapata - arXiv preprint arXiv:1906.03221, 2019 - arxiv.org
Recent approaches to data-to-text generation have shown great promise thanks to the use
of large-scale datasets and the application of neural network architectures which are trained …

Relational memory-augmented language models

Q Liu, D Yogatama, P Blunsom - Transactions of the Association for …, 2022 - direct.mit.edu
We present a memory-augmented approach to condition an autoregressive language model
on a knowledge graph. We represent the graph as a collection of relation triples and retrieve …