Pre-trained models for natural language processing: A survey
Recently, the emergence of pre-trained models (PTMs) has brought natural language
processing (NLP) to a new era. In this survey, we provide a comprehensive review of PTMs …
A comprehensive survey on process-oriented automatic text summarization with exploration of LLM-based methods
Automatic Text Summarization (ATS), utilizing Natural Language Processing (NLP)
algorithms, aims to create concise and accurate summaries, thereby significantly reducing …
Compositional exemplars for in-context learning
Large pretrained language models (LMs) have shown impressive In-Context Learning (ICL)
ability, where the model learns to do an unseen task simply by conditioning on a prompt …
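As a toy illustration of the conditioning described in the snippet above (not the exemplar-selection method proposed in that paper), the following sketch assembles an in-context prompt from a few labeled exemplars; the task, exemplars, and formatting are hypothetical.

```python
# Minimal sketch: build an in-context learning prompt from labeled exemplars
# plus a new query; a frozen LM would complete the final label by conditioning
# on this prompt. Task and examples are hypothetical.
exemplars = [
    ("The movie was a delight from start to finish.", "positive"),
    ("I regret buying this blender.", "negative"),
]
query = "The service was slow but the food was excellent."

prompt = "\n\n".join(
    f"Review: {text}\nSentiment: {label}" for text, label in exemplars
)
prompt += f"\n\nReview: {query}\nSentiment:"

print(prompt)
```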
BARTScore: Evaluating generated text as text generation
A wide variety of NLP applications, such as machine translation, summarization, and dialog,
involve text generation. One major challenge for these applications is how to evaluate …
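The title above frames evaluation itself as a generation problem; a rough sketch of that idea follows, assuming the Hugging Face transformers and torch packages and a pretrained BART checkpoint. The simplified averaged log-likelihood below is an illustration, not the paper's exact scoring recipe.

```python
import torch
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

# Hedged sketch: score a hypothesis by the average log-likelihood a seq2seq
# model assigns to it when conditioned on the source text.
name = "facebook/bart-large-cnn"  # assumed checkpoint; any BART-style model works
tok = AutoTokenizer.from_pretrained(name)
model = AutoModelForSeq2SeqLM.from_pretrained(name).eval()

def gen_score(source: str, hypothesis: str) -> float:
    enc = tok(source, return_tensors="pt", truncation=True)
    labels = tok(hypothesis, return_tensors="pt", truncation=True).input_ids
    with torch.no_grad():
        out = model(**enc, labels=labels)
    # out.loss is the mean token-level cross-entropy, so its negation is the
    # average log-probability of the hypothesis given the source.
    return -out.loss.item()

print(gen_score("The cat sat on the mat all afternoon.", "A cat rested on a mat."))
```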
Prefix-tuning: Optimizing continuous prompts for generation
Fine-tuning is the de facto way to leverage large pretrained language models to perform
downstream tasks. However, it modifies all the language model parameters and therefore …
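To make the contrast with full fine-tuning concrete, here is a simplified soft-prompt variant that trains only a small set of vectors prepended at the input-embedding layer of a frozen LM; the paper's prefix-tuning instead learns per-layer key/value prefixes, so this is a stand-in sketch, with the model name, prefix length, and initialization chosen arbitrarily.

```python
import torch
import torch.nn as nn
from transformers import AutoTokenizer, AutoModelForCausalLM

class SoftPromptLM(nn.Module):
    """Frozen causal LM with a small trainable prompt prepended to the input
    embeddings -- a simplified stand-in for per-layer prefix-tuning."""

    def __init__(self, name: str = "gpt2", prefix_len: int = 10):
        super().__init__()
        self.lm = AutoModelForCausalLM.from_pretrained(name)
        for p in self.lm.parameters():            # freeze all LM parameters
            p.requires_grad_(False)
        dim = self.lm.get_input_embeddings().embedding_dim
        self.prefix = nn.Parameter(torch.randn(prefix_len, dim) * 0.02)

    def forward(self, input_ids):
        tok_emb = self.lm.get_input_embeddings()(input_ids)           # (B, T, D)
        pre = self.prefix.unsqueeze(0).expand(input_ids.size(0), -1, -1)
        inputs_embeds = torch.cat([pre, tok_emb], dim=1)              # (B, P+T, D)
        return self.lm(inputs_embeds=inputs_embeds)

tok = AutoTokenizer.from_pretrained("gpt2")
model = SoftPromptLM()
ids = tok("Summarize: the meeting covered budget and hiring.", return_tensors="pt").input_ids
logits = model(ids).logits   # only model.prefix would receive gradients during training
print(logits.shape)
```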
QMSum: A new benchmark for query-based multi-domain meeting summarization
Meetings are a key component of human collaboration. As increasing numbers of meetings
are recorded and transcribed, meeting summaries have become essential to remind those …
Align and attend: Multimodal summarization with dual contrastive losses
The goal of multimodal summarization is to extract the most important information from
different modalities to form summaries. Unlike unimodal summarization, the multimodal …
An improved GNN using dynamic graph embedding mechanism: A novel end-to-end framework for rolling bearing fault diagnosis under variable working conditions
Z Yu, C Zhang, C Deng - Mechanical Systems and Signal Processing, 2023 - Elsevier
Traditional deep learning (DL)-based rolling bearing fault diagnosis methods usually use
signals collected under a specific working condition to train the diagnosis models. This may …
Heterogeneous graph neural networks for extractive document summarization
As a crucial step in extractive document summarization, learning cross-sentence relations
has been explored by a plethora of approaches. An intuitive way is to put them in the graph …
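As a toy illustration of putting sentences "in the graph" (not the heterogeneous sentence-word graph used in that paper), the sketch below links sentences by word overlap; the tokenization and threshold are arbitrary assumptions.

```python
import itertools

# Toy sketch: connect sentences whose word overlap exceeds a threshold,
# yielding an edge list a GNN-style summarizer could operate on.
sentences = [
    "The committee approved the new budget.",
    "The budget includes funding for two new hires.",
    "Lunch will be served in the main hall.",
]

def words(s):
    return {w.strip(".,").lower() for w in s.split()}

edges = [
    (i, j)
    for i, j in itertools.combinations(range(len(sentences)), 2)
    if len(words(sentences[i]) & words(sentences[j])) >= 2  # arbitrary threshold
]
print(edges)  # [(0, 1)] -- the two budget sentences are linked
```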
Extractive summarization via ChatGPT for faithful summary generation
Extractive summarization is a crucial task in natural language processing that aims to
condense long documents into shorter versions by directly extracting sentences. The recent …
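A minimal sketch of how one might prompt a chat LLM toward extractive (verbatim-sentence) summarization; the prompt wording is an illustrative assumption, not the protocol evaluated in that paper.

```python
# Illustrative prompt asking a chat LLM to extract sentences verbatim rather
# than paraphrase; the wording is a hypothetical template.
def extractive_prompt(document: str, k: int = 3) -> str:
    return (
        f"Select the {k} sentences from the document below that best summarize it. "
        "Copy them verbatim, one per line, and do not rewrite or add anything.\n\n"
        f"Document:\n{document}"
    )

doc = "The board met on Tuesday. Revenue rose 12%. The picnic was postponed."
print(extractive_prompt(doc, k=1))
```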