A survey of knowledge enhanced pre-trained language models
Pre-trained Language Models (PLMs), which are trained on large text corpora via self-supervised learning, have yielded promising performance on various tasks in …
A survey of knowledge-enhanced text generation
The goal of text-to-text generation is to make machines express themselves like humans in many applications such as conversation, summarization, and translation. It is one of the most …
Unifying large language models and knowledge graphs: A roadmap
Large language models (LLMs), such as ChatGPT and GPT4, are making new waves in the
field of natural language processing and artificial intelligence, due to their emergent ability …
K-lite: Learning transferable visual models with external knowledge
The new generation of state-of-the-art computer vision systems is trained from natural language supervision, ranging from simple object category names to descriptive captions …
Jaket: Joint pre-training of knowledge graph and language understanding
Knowledge graphs (KGs) contain rich information about world knowledge, entities, and relations. Thus, they can be great supplements to existing pre-trained language models …
A survey of multi-task learning in natural language processing: Regarding task relatedness and training methods
Multi-task learning (MTL) has become increasingly popular in natural language processing
(NLP) because it improves the performance of related tasks by exploiting their …
Learning customized visual models with retrieval-augmented knowledge
Image-text contrastive learning models such as CLIP have demonstrated strong task transfer ability. The high generality and usability of these visual models are achieved via a web-scale …
Adaprompt: Adaptive model training for prompt-based NLP
Prompt-based learning, with its capability to tackle zero-shot and few-shot NLP tasks, has gained much attention in the community. The main idea is to bridge the gap between NLP …
Knowledge-augmented methods for natural language processing
Knowledge in NLP has been a rising trend, especially after the advent of large-scale pre-trained models. Knowledge is critical for equipping statistics-based models with common sense …
Diversifying content generation for commonsense reasoning with mixture of knowledge graph experts
Generative commonsense reasoning (GCR) in natural language is the task of reasoning about commonsense while generating coherent text. Recent years have seen a surge of interest in …