A survey of the usages of deep learning for natural language processing
Over the last several years, the field of natural language processing has been propelled
forward by an explosion in the use of deep learning models. This article provides a brief …
A study of generative large language model for medical research and healthcare
There is enormous enthusiasm, as well as concern, about applying large language models (LLMs) to
healthcare. Yet current assumptions are based on general-purpose LLMs such as ChatGPT …
Creativity and machine learning: A survey
There is a growing interest in the area of machine learning and creativity. This survey
presents an overview of the history and the state of the art of computational creativity …
A knowledge-enhanced pretraining model for commonsense story generation
Story generation, namely, generating a reasonable story from a leading context, is an
important but challenging task. In spite of the success in modeling fluency and local …
Enabling language models to fill in the blanks
We present a simple approach for text infilling, the task of predicting missing spans of text at
any position in a document. While infilling could enable rich functionality especially for …
The perils of using Mechanical Turk to evaluate open-ended text generation
Recent text generation research has increasingly focused on open-ended domains such as
story and poetry generation. Because models built for such tasks are difficult to evaluate …
Strategies for structuring story generation
Writers generally rely on plans or sketches to write long stories, but most current language
models generate word by word from left to right. We explore coarse-to-fine models for …
Story ending generation with incremental encoding and commonsense knowledge
Generating a reasonable ending for a given story context, i.e., story ending generation, is a
strong indication of story comprehension. This task requires not only understanding the …
Data-to-text generation with entity modeling
Recent approaches to data-to-text generation have shown great promise thanks to the use
of large-scale datasets and the application of neural network architectures which are trained …
Relational memory-augmented language models
We present a memory-augmented approach to condition an autoregressive language model
on a knowledge graph. We represent the graph as a collection of relation triples and retrieve …