Multi-document summarization via deep learning techniques: A survey

C Ma, WE Zhang, M Guo, H Wang, QZ Sheng - ACM Computing Surveys, 2022 - dl.acm.org
Multi-document summarization (MDS) is an effective tool for information aggregation that
generates an informative and concise summary from a cluster of topic-related documents …

End-to-end transformer-based models in textual-based NLP

A Rahali, MA Akhloufi - AI, 2023 - mdpi.com
Transformer architectures are highly expressive because they use self-attention
mechanisms to encode long-range dependencies in the input sequences. In this paper, we …
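
As a point of reference for the self-attention claim in this snippet, here is a minimal NumPy sketch of scaled dot-product self-attention (a generic illustration, not code from the surveyed paper): every token's query is compared against every token's key, which is what lets dependencies of arbitrary distance within the sequence be encoded.

```python
# Minimal sketch of scaled dot-product self-attention (generic illustration).
import numpy as np

def self_attention(X, Wq, Wk, Wv):
    """X: (seq_len, d_model); Wq/Wk/Wv: (d_model, d_head) projection matrices."""
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    scores = Q @ K.T / np.sqrt(K.shape[-1])          # every token attends to every token
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)   # softmax over the full sequence
    return weights @ V                               # contextualized token representations

# Toy usage: 6 tokens, 8-dim embeddings, 4-dim head.
rng = np.random.default_rng(0)
X = rng.normal(size=(6, 8))
out = self_attention(X, rng.normal(size=(8, 4)), rng.normal(size=(8, 4)), rng.normal(size=(8, 4)))
print(out.shape)  # (6, 4)
```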

Unlimiformer: Long-range transformers with unlimited length input

A Bertsch, U Alon, G Neubig… - Advances in Neural …, 2024 - proceedings.neurips.cc
Since the proposal of transformers, these models have been limited to bounded input
lengths, because of their need to attend to every token in the input. In this work, we propose …
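
The title and snippet point to retrieval as the way around the bounded-length limit. The sketch below is a hedged illustration of that general idea, not the authors' implementation: each query attends only to its top-k most similar keys (found here by brute force; a real system would use an approximate nearest-neighbor index), so the attention computed per query no longer covers every token of an arbitrarily long input.

```python
# Rough illustration (not the authors' code): attend only to the top-k keys most
# similar to each query instead of attending to every encoded token.
import numpy as np

def topk_attention(queries, keys, values, k=32):
    """queries: (m, d); keys/values: (n, d) with n possibly very large."""
    out = np.zeros_like(queries)
    for i, q in enumerate(queries):
        scores = keys @ q                          # similarity against all stored keys
        idx = np.argpartition(scores, -k)[-k:]     # indices of the k best keys
        s = scores[idx] / np.sqrt(keys.shape[-1])
        w = np.exp(s - s.max()); w /= w.sum()      # softmax over the retrieved subset only
        out[i] = w @ values[idx]
    return out

# Toy usage: 100k encoded tokens, 4 decoder queries.
rng = np.random.default_rng(1)
keys = rng.normal(size=(100_000, 16))
print(topk_attention(rng.normal(size=(4, 16)), keys, keys, k=64).shape)  # (4, 16)
```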

A survey on long text modeling with transformers

Z Dong, T Tang, L Li, WX Zhao - arXiv preprint arXiv:2302.14502, 2023 - arxiv.org
Modeling long texts has been an essential technique in the field of natural language
processing (NLP). With the ever-growing number of long documents, it is important to …

Summ^N: A multi-stage summarization framework for long input dialogues and documents

Y Zhang, A Ni, Z Mao, CH Wu, C Zhu, B Deb… - arXiv preprint arXiv …, 2021 - arxiv.org
Text summarization helps readers capture salient information from documents, news,
interviews, and meetings. However, most state-of-the-art pretrained language models (LMs) …
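
A hedged sketch of the multi-stage idea suggested by the title: split the long input into chunks a short-input model can handle, summarize each chunk, concatenate the partial summaries, and repeat until a single pass fits. `summarize_short` is a placeholder for any backbone summarizer (e.g., a fine-tuned seq-to-seq model), not the authors' code.

```python
# Multi-stage summarization sketch: chunk, summarize, concatenate, repeat.

def chunk(words, max_words):
    return [words[i:i + max_words] for i in range(0, len(words), max_words)]

def multi_stage_summarize(text, summarize_short, max_words=512):
    words = text.split()
    while len(words) > max_words:                        # coarse stages
        pieces = [" ".join(c) for c in chunk(words, max_words)]
        words = " ".join(summarize_short(p) for p in pieces).split()
    return summarize_short(" ".join(words))              # final fine-grained stage

# Toy usage with a trivial "summarizer" that keeps the first 50 words of each chunk.
fake = lambda t: " ".join(t.split()[:50])
print(len(multi_stage_summarize("word " * 5000, fake, max_words=512).split()))
```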

DYLE: Dynamic latent extraction for abstractive long-input summarization

Z Mao, CH Wu, A Ni, Y Zhang, R Zhang, T Yu… - arXiv preprint arXiv …, 2021 - arxiv.org
Transformer-based models have achieved state-of-the-art performance on short-input
summarization. However, they still struggle with summarizing longer text. In this paper, we …
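
The title points to an extract-then-generate design for long inputs. Below is a generic, hedged sketch of that pattern (score source sentences with a cheap extractor, keep the top-k, and feed only those to an abstractive generator); the paper's dynamic, per-decoding-step weighting of extracted snippets is not reproduced here, and `score_fn`/`generate_fn` are placeholders.

```python
# Extract-then-generate sketch for long-input summarization.

def extract_then_generate(sentences, score_fn, generate_fn, k=20):
    """sentences: list[str]; score_fn: str -> float; generate_fn: str -> str."""
    ranked = sorted(sentences, key=score_fn, reverse=True)[:k]
    # keep the selected sentences in their original document order before generation
    selected = [s for s in sentences if s in set(ranked)]
    return generate_fn(" ".join(selected))

# Toy usage: salience approximated by sentence length, "generator" truncates its input.
sents = [f"sentence {i} " + "x " * (i % 7) for i in range(200)]
summary = extract_then_generate(sents, score_fn=len, generate_fn=lambda t: t[:200], k=10)
print(len(summary))
```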

Leveraging pretrained models for automatic summarization of doctor-patient conversations

L Zhang, R Negrinho, A Ghosh, V Jagannathan… - arXiv preprint arXiv …, 2021 - arxiv.org
Fine-tuning pretrained models for automatically summarizing doctor-patient conversation
transcripts presents many challenges: limited training data, significant domain shift, long and …

GRETEL: Graph contrastive topic enhanced language model for long document extractive summarization

Q **e, J Huang, T Saha, S Ananiadou - arxiv preprint arxiv:2208.09982, 2022 - arxiv.org
Recently, neural topic models (NTMs) have been incorporated into pre-trained language
models (PLMs), to capture the global semantic information for text summarization. However …
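
Purely as an illustration of the general idea of injecting topic information into PLM-based extractive scoring (not the authors' graph-contrastive model), one simple variant concatenates each sentence's PLM embedding with a document-level topic distribution before a linear scorer:

```python
# Illustrative only: combine PLM sentence embeddings with a topic mixture for scoring.
import numpy as np

def score_sentences(sent_embs, doc_topics, w):
    """sent_embs: (n, d) PLM sentence embeddings; doc_topics: (t,) topic mixture; w: (d + t,)."""
    feats = np.hstack([sent_embs, np.tile(doc_topics, (sent_embs.shape[0], 1))])
    return feats @ w  # higher score = more likely to be extracted

rng = np.random.default_rng(2)
scores = score_sentences(rng.normal(size=(12, 8)), rng.dirichlet(np.ones(5)), rng.normal(size=13))
print(scores.argsort()[::-1][:3])  # indices of the three top-scored sentences
```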

Abstractive text summarization: State of the art, challenges, and improvements

H Shakil, A Farooq, J Kalita - Neurocomputing, 2024 - Elsevier
Specifically focusing on the landscape of abstractive text summarization, as opposed to
extractive techniques, this survey presents a comprehensive overview, delving into state-of …

GenCompareSum: a hybrid unsupervised summarization method using salience

J Bishop, Q **e, S Ananiadou - Proceedings of the 21st workshop …, 2022 - aclanthology.org
Text summarization (TS) is an important NLP task. Pre-trained Language Models (PLMs)
have been used to improve the performance of TS. However, PLMs are limited by their need …
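
A hedged sketch of the hybrid recipe suggested by the title: generate short salient fragments with a generative model, then extract the source sentences most similar to those fragments. `generate_fragments` and `similarity` are placeholders (e.g., a T5-style generator and an embedding- or BERTScore-style similarity), not the paper's exact components.

```python
# Hybrid generate-then-extract sketch: abstractive fragments guide extractive selection.

def hybrid_extract(sentences, generate_fragments, similarity, n=5):
    fragments = generate_fragments(sentences)            # salient snippets, generative step
    scored = [(max(similarity(s, f) for f in fragments), s) for s in sentences]
    top = {s for _, s in sorted(scored, reverse=True)[:n]}
    return [s for s in sentences if s in top]             # extractive summary, original order

# Toy usage: word-overlap as the similarity, first sentence as the only "fragment".
overlap = lambda a, b: len(set(a.split()) & set(b.split()))
sents = ["alpha beta gamma", "beta gamma delta", "unrelated words here", "alpha delta"]
print(hybrid_extract(sents, lambda ss: ss[:1], overlap, n=2))
```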