Survey of the state of the art in natural language generation: Core tasks, applications and evaluation
This paper surveys the current state of the art in Natural Language Generation (NLG),
defined as the task of generating text or speech from non-linguistic input. A survey of NLG is …
Decoding methods in neural language generation: a survey
Neural encoder-decoder models for language generation can be trained to predict words
directly from linguistic or non-linguistic inputs. When generating with these so-called end-to …
Data-to-text generation with content selection and planning
Recent advances in data-to-text generation have led to the use of large-scale datasets and
neural network models which are trained end-to-end, without explicitly modeling what to say …
DART: Open-domain structured data record to text generation
We present DART, an open domain structured DAta Record to Text generation dataset with
over 82k instances (DARTs). Data-to-Text annotations can be a costly process, especially …
Table-to-text generation by structure-aware seq2seq learning
Table-to-text generation aims to generate a description for a factual table which can be
viewed as a set of field-value records. To encode both the content and the structure of a …
Step-by-step: Separating planning from realization in neural data-to-text generation
Data-to-text generation can be conceptually divided into two parts: ordering and structuring
the information (planning), and generating fluent language describing the information …
What to talk about and how? Selective generation using LSTMs with coarse-to-fine alignment
We propose an end-to-end, domain-independent neural encoder-aligner-decoder model for
selective generation, i.e., the joint task of content selection and surface realization. Our model …
Few-shot NLG with pre-trained language model
Neural-based end-to-end approaches to natural language generation (NLG) from structured
data or knowledge are data-hungry, making their adoption for real-world applications difficult …
Densely connected graph convolutional networks for graph-to-sequence learning
We focus on graph-to-sequence learning, which can be framed as transducing graph
structures to sequences for text generation. To capture structural information associated with …
structures to sequences for text generation. To capture structural information associated with …
Introducing hypergraph signal processing: Theoretical foundation and practical applications
Signal processing over graphs has recently attracted significant attention for dealing with the
structured data. Normal graphs, however, only model pairwise relationships between nodes …