Survey of the state of the art in natural language generation: Core tasks, applications and evaluation
This paper surveys the current state of the art in Natural Language Generation (NLG),
defined as the task of generating text or speech from non-linguistic input. A survey of NLG is …
Processing social media messages in mass emergency: A survey
Social media platforms provide active communication channels during mass convergence
and emergency events such as disasters caused by natural hazards. As a result, first …
Benchmarking large language models for news summarization
Large language models (LLMs) have shown promise for automatic summarization but the
reasons behind their successes are poorly understood. By conducting a human evaluation …
On faithfulness and factuality in abstractive summarization
It is well known that the standard likelihood training and approximate decoding objectives in
neural text generation models lead to less human-like responses for open-ended tasks such …
XL-Sum: Large-scale multilingual abstractive summarization for 44 languages
Contemporary works on abstractive text summarization have focused primarily on high-
resource languages like English, mostly due to the limited availability of datasets for low/mid …
Calibrating sequence likelihood improves conditional language generation
Conditional language models are predominantly trained with maximum likelihood estimation
(MLE), giving probability mass to sparsely observed target sequences. While MLE trained …
Don't give me the details, just the summary! Topic-aware convolutional neural networks for extreme summarization
We introduce extreme summarization, a new single-document summarization task which
does not favor extractive strategies and calls for an abstractive modeling approach. The idea …
HIBERT: Document level pre-training of hierarchical bidirectional transformers for document summarization
Neural extractive summarization models usually employ a hierarchical encoder for
document encoding and they are trained using sentence-level labels, which are created …
Learning spatial-temporal regularized correlation filters for visual tracking
Discriminative Correlation Filters (DCF) are efficient in visual tracking but suffer from
unwanted boundary effects. Spatially Regularized DCF (SRDCF) has been suggested to …
Ranking sentences for extractive summarization with reinforcement learning
Single document summarization is the task of producing a shorter version of a document
while preserving its principal information content. In this paper we conceptualize extractive …