Pre-trained models for natural language processing: A survey

X Qiu, T Sun, Y Xu, Y Shao, N Dai, X Huang - Science China …, 2020 - Springer
Recently, the emergence of pre-trained models (PTMs) has brought natural language
processing (NLP) to a new era. In this survey, we provide a comprehensive review of PTMs …

A comprehensive survey on process-oriented automatic text summarization with exploration of LLM-based methods

H **, Y Zhang, D Meng, J Wang, J Tan - arXiv preprint arXiv:2403.02901, 2024 - arxiv.org
Automatic Text Summarization (ATS), utilizing Natural Language Processing (NLP)
algorithms, aims to create concise and accurate summaries, thereby significantly reducing …

Compositional exemplars for in-context learning

J Ye, Z Wu, J Feng, T Yu… - … Conference on Machine …, 2023 - proceedings.mlr.press
Large pretrained language models (LMs) have shown impressive In-Context Learning (ICL)
ability, where the model learns to do an unseen task simply by conditioning on a prompt …

BARTScore: Evaluating generated text as text generation

W Yuan, G Neubig, P Liu - Advances in neural information …, 2021 - proceedings.neurips.cc
A wide variety of NLP applications, such as machine translation, summarization, and dialog,
involve text generation. One major challenge for these applications is how to evaluate …

Prefix-tuning: Optimizing continuous prompts for generation

XL Li, P Liang - arXiv preprint arXiv:2101.00190, 2021 - arxiv.org
Fine-tuning is the de facto way to leverage large pretrained language models to perform
downstream tasks. However, it modifies all the language model parameters and therefore …

QMSum: A new benchmark for query-based multi-domain meeting summarization

M Zhong, D Yin, T Yu, A Zaidi, M Mutuma, R Jha… - arXiv preprint arXiv …, 2021 - arxiv.org
Meetings are a key component of human collaboration. As increasing numbers of meetings
are recorded and transcribed, meeting summaries have become essential to remind those …

Align and attend: Multimodal summarization with dual contrastive losses

B He, J Wang, J Qiu, T Bui… - Proceedings of the …, 2023 - openaccess.thecvf.com
The goal of multimodal summarization is to extract the most important information from
different modalities to form summaries. Unlike unimodal summarization, the multimodal …

An improved GNN using dynamic graph embedding mechanism: A novel end-to-end framework for rolling bearing fault diagnosis under variable working conditions

Z Yu, C Zhang, C Deng - Mechanical Systems and Signal Processing, 2023 - Elsevier
Traditional deep learning (DL)-based rolling bearing fault diagnosis methods usually use
signals collected under a specific working condition to train the diagnosis models. This may …

Heterogeneous graph neural networks for extractive document summarization

D Wang, P Liu, Y Zheng, X Qiu, X Huang - arXiv preprint arXiv:2004.12393, 2020 - arxiv.org
As a crucial step in extractive document summarization, learning cross-sentence relations
has been explored by a plethora of approaches. An intuitive way is to put them in the graph …

Extractive summarization via ChatGPT for faithful summary generation

H Zhang, X Liu, J Zhang - arXiv preprint arXiv:2304.04193, 2023 - arxiv.org
Extractive summarization is a crucial task in natural language processing that aims to
condense long documents into shorter versions by directly extracting sentences. The recent …