Transformer-patcher: One mistake worth one neuron
Large Transformer-based Pretrained Language Models (PLMs) dominate almost all Natural
Language Processing (NLP) tasks. Nevertheless, they still make mistakes from time to time …
Facing the elephant in the room: Visual prompt tuning or full finetuning?
As the scale of vision models continues to grow, the emergence of Visual Prompt Tuning
(VPT) as a parameter-efficient transfer learning technique has gained attention due to its …
What all do audio transformer models hear? Probing acoustic representations for language delivery and its structure
In recent times, BERT-based transformer models have become an inseparable part of
the 'tech stack' of text processing models. Similar progress is being observed in the speech …
What do audio transformers hear? Probing their representations for language delivery & structure
Transformer models across multiple domains such as natural language processing and
speech form an unavoidable part of the tech stack of practitioners and researchers alike. Au …
Triviahg: A dataset for automatic hint generation from factoid questions
Nowadays, individuals tend to engage in dialogues with Large Language Models, seeking
answers to their questions. In times when such answers are readily accessible to anyone …
What does the language system look like in pre-trained language models? A study using complex networks
J Zheng - Knowledge-Based Systems, 2024 - Elsevier
Pre-trained language models (PLMs) have advanced the field of natural language processing. The
exceptional capabilities exhibited by PLMs in NLP tasks have been attracting researchers to …
Emotion AWARE: an artificial intelligence framework for adaptable, robust, explainable, and multi-granular emotion analysis
Emotions are fundamental to human behaviour. How we feel, individually and collectively,
determines how humanity evolves and advances into our shared future. The rapid …
Visual explanation for open-domain question answering with BERT
Open-domain question answering (OpenQA) is an essential but challenging task in natural
language processing that aims to answer questions in natural language formats on the basis …
What does BERT learn about prosody?
Language models have become nearly ubiquitous in natural language processing
applications achieving state-of-the-art results in many tasks including prosody. As the model …
NLRG at SemEval-2021 task 5: Toxic spans detection leveraging BERT-based token classification and span prediction techniques
Toxicity detection of text has been a popular NLP task in recent years. In SemEval-2021
Task-5 Toxic Spans Detection, the focus is on detecting toxic spans within passages. Most …