A survey of controllable text generation using transformer-based pre-trained language models
Controllable Text Generation (CTG) is an emerging area in the field of natural language
generation (NLG). It is regarded as crucial for the development of advanced text generation …
Molecular design in drug discovery: a comprehensive review of deep generative models
Deep generative models have seen an upsurge in the deep learning community since they
were proposed. These models are designed for generating new synthetic data including …
Graph neural networks: foundation, frontiers and applications
The field of graph neural networks (GNNs) has seen rapid and incredible strides over the
recent years. Graph neural networks, also known as deep learning on graphs, graph …
Graph neural networks for natural language processing: A survey
Deep learning has become the dominant approach in addressing various tasks in Natural
Language Processing (NLP). Although text inputs are typically represented as a sequence …
Text2Event: Controllable sequence-to-structure generation for end-to-end event extraction
Event extraction is challenging due to the complex structure of event records and the
semantic gap between text and event. Traditional methods usually extract event records by …
Knowledge graph contrastive learning based on relation-symmetrical structure
Knowledge graph embedding (KGE) aims at learning powerful representations to benefit
various artificial intelligence applications. Meanwhile, contrastive learning has been widely …
Investigating pretrained language models for graph-to-text generation
Graph-to-text generation aims to generate fluent texts from graph-based data. In this paper,
we investigate two recently proposed pretrained language models (PLMs) and analyze the …
One SPRING to rule them both: Symmetric AMR semantic parsing and generation without a complex pipeline
In Text-to-AMR parsing, current state-of-the-art semantic parsers use cumbersome pipelines
integrating several different modules or components, and exploit graph recategorization, i.e., …
JointGT: Graph-text joint representation learning for text generation from knowledge graphs
Existing pre-trained models for knowledge-graph-to-text (KG-to-text) generation simply fine-
tune text-to-text pre-trained models such as BART or T5 on KG-to-text datasets, which …
A systematic literature review on text generation using deep neural network models
In recent years, significant progress has been made in text generation. The latest text
generation models are revolutionizing the domain by generating human-like text. It has …