A survey of controllable text generation using transformer-based pre-trained language models

H Zhang, H Song, S Li, M Zhou, D Song - ACM Computing Surveys, 2023 - dl.acm.org
Controllable Text Generation (CTG) is an emerging area in the field of natural language
generation (NLG). It is regarded as crucial for the development of advanced text generation …

Molecular design in drug discovery: a comprehensive review of deep generative models

Y Cheng, Y Gong, Y Liu, B Song… - Briefings in …, 2021 - academic.oup.com
Deep generative models have seen an upsurge in the deep learning community since they
were proposed. These models are designed to generate new synthetic data, including …

Graph neural networks: foundation, frontiers and applications

L Wu, P Cui, J Pei, L Zhao, X Guo - … of the 28th ACM SIGKDD conference …, 2022 - dl.acm.org
The field of graph neural networks (GNNs) has seen rapid and incredible strides in
recent years. Graph neural networks, also known as deep learning on graphs, graph …

Graph neural networks for natural language processing: A survey

L Wu, Y Chen, K Shen, X Guo, H Gao… - … and Trends® in …, 2023 - nowpublishers.com
Deep learning has become the dominant approach in addressing various tasks in Natural
Language Processing (NLP). Although text inputs are typically represented as a sequence …

Text2Event: Controllable sequence-to-structure generation for end-to-end event extraction

Y Lu, H Lin, J Xu, X Han, J Tang, A Li, L Sun… - arXiv preprint arXiv …, 2021 - arxiv.org
Event extraction is challenging due to the complex structure of event records and the
semantic gap between text and events. Traditional methods usually extract event records by …

Knowledge graph contrastive learning based on relation-symmetrical structure

K Liang, Y Liu, S Zhou, W Tu, Y Wen… - … on Knowledge and …, 2023 - ieeexplore.ieee.org
Knowledge graph embedding (KGE) aims at learning powerful representations to benefit
various artificial intelligence applications. Meanwhile, contrastive learning has been widely …

Investigating pretrained language models for graph-to-text generation

LFR Ribeiro, M Schmitt, H Schütze… - arXiv preprint arXiv …, 2020 - arxiv.org
Graph-to-text generation aims to generate fluent texts from graph-based data. In this paper,
we investigate two recently proposed pretrained language models (PLMs) and analyze the …

One SPRING to rule them both: Symmetric AMR semantic parsing and generation without a complex pipeline

M Bevilacqua, R Blloshmi, R Navigli - Proceedings of the AAAI …, 2021 - ojs.aaai.org
In Text-to-AMR parsing, current state-of-the-art semantic parsers use cumbersome pipelines
integrating several different modules or components, and exploit graph recategorization, i.e. …

JointGT: Graph-text joint representation learning for text generation from knowledge graphs

P Ke, H Ji, Y Ran, X Cui, L Wang, L Song, X Zhu… - arXiv preprint arXiv …, 2021 - arxiv.org
Existing pre-trained models for knowledge-graph-to-text (KG-to-text) generation simply
fine-tune text-to-text pre-trained models such as BART or T5 on KG-to-text datasets, which …

A systematic literature review on text generation using deep neural network models

N Fatima, AS Imran, Z Kastrati, SM Daudpota… - IEEE …, 2022 - ieeexplore.ieee.org
In recent years, significant progress has been made in text generation. The latest text
generation models are revolutionizing the domain by generating human-like text. It has …