Survey of the state of the art in natural language generation: Core tasks, applications and evaluation

A Gatt, E Krahmer - Journal of Artificial Intelligence Research, 2018 - jair.org
This paper surveys the current state of the art in Natural Language Generation (NLG),
defined as the task of generating text or speech from non-linguistic input. A survey of NLG is …

Processing social media messages in mass emergency: A survey

M Imran, C Castillo, F Diaz, S Vieweg - ACM Computing Surveys (CSUR), 2015 - dl.acm.org
Social media platforms provide active communication channels during mass convergence
and emergency events such as disasters caused by natural hazards. As a result, first …

Benchmarking large language models for news summarization

T Zhang, F Ladhak, E Durmus, P Liang… - Transactions of the …, 2024 - direct.mit.edu
Large language models (LLMs) have shown promise for automatic summarization but the
reasons behind their successes are poorly understood. By conducting a human evaluation …

On faithfulness and factuality in abstractive summarization

J Maynez, S Narayan, B Bohnet… - arXiv preprint arXiv …, 2020 - arxiv.org
It is well known that the standard likelihood training and approximate decoding objectives in
neural text generation models lead to less human-like responses for open-ended tasks such …

XL-Sum: Large-scale multilingual abstractive summarization for 44 languages

T Hasan, A Bhattacharjee, MS Islam, K Samin… - arXiv preprint arXiv …, 2021 - arxiv.org
Contemporary works on abstractive text summarization have focused primarily on high-
resource languages like English, mostly due to the limited availability of datasets for low/mid …

Calibrating sequence likelihood improves conditional language generation

Y Zhao, M Khalman, R Joshi, S Narayan… - arXiv preprint arXiv …, 2022 - arxiv.org
Conditional language models are predominantly trained with maximum likelihood estimation
(MLE), giving probability mass to sparsely observed target sequences. While MLE trained …
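
As a hedged illustration of the MLE objective this abstract refers to, the sketch below shows the standard token-level cross-entropy loss for a conditional language model; the tensor shapes and the PyTorch-style interface are illustrative assumptions, not the paper's implementation.

    import torch
    import torch.nn.functional as F

    def mle_loss(logits, target_ids, pad_id=0):
        """Standard maximum-likelihood (cross-entropy) loss for a conditional LM.

        logits:     (batch, seq_len, vocab) decoder scores at each target position
        target_ids: (batch, seq_len) gold target token ids

        All probability mass is pushed onto the single observed target sequence,
        which is the behaviour that sequence-likelihood calibration aims to correct.
        """
        vocab = logits.size(-1)
        return F.cross_entropy(
            logits.reshape(-1, vocab),   # flatten positions
            target_ids.reshape(-1),      # gold token at each position
            ignore_index=pad_id,         # do not penalise padding tokens
        )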

Don't give me the details, just the summary! Topic-aware convolutional neural networks for extreme summarization

S Narayan, SB Cohen, M Lapata - arXiv preprint arXiv:1808.08745, 2018 - arxiv.org
We introduce extreme summarization, a new single-document summarization task which
does not favor extractive strategies and calls for an abstractive modeling approach. The idea …

HIBERT: Document level pre-training of hierarchical bidirectional transformers for document summarization

X Zhang, F Wei, M Zhou - arXiv preprint arXiv:1905.06566, 2019 - arxiv.org
Neural extractive summarization models usually employ a hierarchical encoder for
document encoding and they are trained using sentence-level labels, which are created …
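
The sentence-level labels mentioned above are commonly derived with a greedy oracle that marks sentences whose addition improves overlap with the reference summary. The sketch below is a hedged illustration of that idea using unigram F1 as a stand-in for ROUGE; it is not the HIBERT authors' exact procedure.

    def greedy_extractive_labels(doc_sentences, reference, max_sents=3):
        """Assign 0/1 extraction labels to document sentences.

        Greedily add the sentence that most improves unigram-F1 overlap with
        the reference summary (a simple proxy for ROUGE-based oracles).
        """
        def f1(selected):
            pred = set(w for s in selected for w in s.lower().split())
            ref = set(reference.lower().split())
            if not pred or not ref:
                return 0.0
            overlap = len(pred & ref)
            p, r = overlap / len(pred), overlap / len(ref)
            return 2 * p * r / (p + r) if p + r else 0.0

        labels, selected, best = [0] * len(doc_sentences), [], 0.0
        for _ in range(max_sents):
            gains = [
                (f1(selected + [s]), i)
                for i, s in enumerate(doc_sentences) if labels[i] == 0
            ]
            score, idx = max(gains, default=(0.0, -1))
            if idx < 0 or score <= best:   # stop when no sentence improves overlap
                break
            best, labels[idx] = score, 1
            selected.append(doc_sentences[idx])
        return labels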

Learning spatial-temporal regularized correlation filters for visual tracking

F Li, C Tian, W Zuo, L Zhang… - Proceedings of the …, 2018 - openaccess.thecvf.com
Discriminative Correlation Filters (DCF) are efficient in visual tracking but suffer from
unwanted boundary effects. Spatially Regularized DCF (SRDCF) has been suggested to …

Ranking sentences for extractive summarization with reinforcement learning

S Narayan, SB Cohen, M Lapata - arXiv preprint arXiv:1802.08636, 2018 - arxiv.org
Single document summarization is the task of producing a shorter version of a document
while preserving its principal information content. In this paper we conceptualize extractive …
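
As a hedged illustration of the reinforcement-learning formulation of extractive summarization, the sketch below shows a REINFORCE-style loss in which sampled sentence selections are rewarded by summary quality; the sentence scorer and reward function are placeholder assumptions, not the paper's architecture. In practice a baseline is typically subtracted from the reward to reduce gradient variance.

    import torch

    def reinforce_loss(sent_logits, reward_fn, num_select=3):
        """REINFORCE-style loss for ranking sentences in extractive summarization.

        sent_logits: (num_sentences,) unnormalised scores from a sentence encoder
        reward_fn:   maps a list of selected sentence indices to a scalar reward,
                     e.g. ROUGE of the extracted summary against the reference
        """
        probs = torch.softmax(sent_logits, dim=-1)
        dist = torch.distributions.Categorical(probs)
        picks = dist.sample((num_select,))        # sample a candidate summary
        reward = reward_fn(picks.tolist())        # evaluate it, e.g. with ROUGE
        log_prob = dist.log_prob(picks).sum()     # log-likelihood of the sample
        return -reward * log_prob                 # gradient ascent on expected reward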