Survey of the state of the art in natural language generation: Core tasks, applications and evaluation

A Gatt, E Krahmer - Journal of Artificial Intelligence Research, 2018 - jair.org
This paper surveys the current state of the art in Natural Language Generation (NLG),
defined as the task of generating text or speech from non-linguistic input. A survey of NLG is …

Decoding methods in neural language generation: a survey

S Zarrieß, H Voigt, S Schüz - Information, 2021 - mdpi.com
Neural encoder-decoder models for language generation can be trained to predict words
directly from linguistic or non-linguistic inputs. When generating with these so-called end-to …

Data-to-text generation with content selection and planning

R Puduppully, L Dong, M Lapata - Proceedings of the AAAI Conference on …, 2019 - aaai.org
Recent advances in data-to-text generation have led to the use of large-scale datasets and
neural network models which are trained end-to-end, without explicitly modeling what to say …

DART: Open-domain structured data record to text generation

L Nan, D Radev, R Zhang, A Rau, A Sivaprasad… - arXiv preprint arXiv …, 2020 - arxiv.org
We present DART, an open domain structured DAta Record to Text generation dataset with
over 82k instances (DARTs). Data-to-Text annotations can be a costly process, especially …

Table-to-text generation by structure-aware seq2seq learning

T Liu, K Wang, L Sha, B Chang, Z Sui - Proceedings of the AAAI …, 2018 - ojs.aaai.org
Table-to-text generation aims to generate a description for a factual table which can be
viewed as a set of field-value records. To encode both the content and the structure of a …

Step-by-step: Separating planning from realization in neural data-to-text generation

A Moryossef, Y Goldberg, I Dagan - arXiv preprint arXiv:1904.03396, 2019 - arxiv.org
Data-to-text generation can be conceptually divided into two parts: ordering and structuring
the information (planning), and generating fluent language describing the information …

What to talk about and how? Selective generation using LSTMs with coarse-to-fine alignment

H Mei, M Bansal, MR Walter - arXiv preprint arXiv:1509.00838, 2015 - arxiv.org
We propose an end-to-end, domain-independent neural encoder-aligner-decoder model for
selective generation, i.e., the joint task of content selection and surface realization. Our model …

Few-shot NLG with pre-trained language model

Z Chen, H Eavani, W Chen, Y Liu, WY Wang - arXiv preprint arXiv …, 2019 - arxiv.org
Neural-based end-to-end approaches to natural language generation (NLG) from structured
data or knowledge are data-hungry, making their adoption for real-world applications difficult …

Densely connected graph convolutional networks for graph-to-sequence learning

Z Guo, Y Zhang, Z Teng, W Lu - Transactions of the Association for …, 2019 - direct.mit.edu
We focus on graph-to-sequence learning, which can be framed as transducing graph
structures to sequences for text generation. To capture structural information associated with …

Introducing hypergraph signal processing: Theoretical foundation and practical applications

S Zhang, Z Ding, S Cui - IEEE Internet of Things Journal, 2019 - ieeexplore.ieee.org
Signal processing over graphs has recently attracted significant attention for dealing with
structured data. Normal graphs, however, only model pairwise relationships between nodes …