A survey of natural language generation

C Dong, Y Li, H Gong, M Chen, J Li, Y Shen… - ACM Computing …, 2022 - dl.acm.org
This article offers a comprehensive review of the research on Natural Language Generation
(NLG) over the past two decades, especially in relation to data-to-text generation and text-to …

Deep learning in the fast lane: A survey on advanced intrusion detection systems for intelligent vehicle networks

M Almehdhar, A Albaseer, MA Khan… - IEEE Open Journal …, 2024 - ieeexplore.ieee.org
The rapid evolution of modern automobiles into intelligent and interconnected entities
presents new challenges in cybersecurity, particularly in Intrusion Detection Systems (IDS) …

Graph contrastive learning with adaptive augmentation

Y Zhu, Y Xu, F Yu, Q Liu, S Wu, L Wang - Proceedings of the Web …, 2021 - dl.acm.org
Recently, contrastive learning (CL) has emerged as a successful method for unsupervised
graph representation learning. Most graph CL methods first perform stochastic augmentation …

Adversarial attack and defense technologies in natural language processing: A survey

S Qiu, Q Liu, S Zhou, W Huang - Neurocomputing, 2022 - Elsevier
Recently, the adversarial attack and defense technology has made remarkable
achievements and has been widely applied in the computer vision field, promoting its rapid …

Optimus: Organizing sentences via pre-trained modeling of a latent space

C Li, X Gao, Y Li, B Peng, X Li, Y Zhang… - arXiv preprint arXiv …, 2020 - arxiv.org
When trained effectively, the Variational Autoencoder (VAE) can be both a powerful
generative model and an effective representation learning framework for natural language …

A review of text style transfer using deep learning

M Toshevska, S Gievska - IEEE Transactions on Artificial …, 2021 - ieeexplore.ieee.org
Style is an integral component of a sentence indicated by the choice of words a person
makes. Different people have different ways of expressing themselves; however, they adjust …

Plug-and-blend: a framework for plug-and-play controllable story generation with sketches

Z Lin, MO Riedl - Proceedings of the AAAI Conference on Artificial …, 2021 - ojs.aaai.org
Large pre-trained neural language models (LM) have very powerful text generation
capabilities. However, in practice, they are hard to control for creative purposes. We …

Composable text controls in latent space with ODEs

G Liu, Z Feng, Y Gao, Z Yang, X Liang, J Bao… - arXiv preprint arXiv …, 2022 - arxiv.org
Real-world text applications often involve composing a wide range of text control operations,
such as editing the text w.r.t. an attribute, manipulating keywords and structure, and …

Leashing the inner demons: Self-detoxification for language models

C Xu, Z He, Z He, J McAuley - Proceedings of the AAAI Conference on …, 2022 - ojs.aaai.org
Language models (LMs) can reproduce (or amplify) toxic language seen during
training, which poses a risk to their practical application. In this paper, we conduct extensive …

Detect and perturb: Neutral rewriting of biased and sensitive text via gradient-based decoding

Z He, BP Majumder, J McAuley - arXiv preprint arXiv:2109.11708, 2021 - arxiv.org
Written language carries explicit and implicit biases that can distract from meaningful
signals. For example, letters of reference may describe male and female candidates …