Joint embedding of words and labels for text classification

G Wang, C Li, W Wang, Y Zhang, D Shen… - arXiv preprint arXiv …, 2018 - arxiv.org
Word embeddings are effective intermediate representations for capturing semantic
regularities between words when learning the representations of text sequences. We …
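
A minimal sketch of the joint word-label embedding idea named in the title: label embeddings live in the same space as word embeddings, and the document representation is built by attending over words according to their compatibility with the labels. The class name, dimensions, and the simple max-over-labels pooling are illustrative assumptions, not the authors' released implementation.

```python
# Hypothetical sketch of label-attentive text classification with joint
# word/label embeddings (illustrative, not the paper's reference code).
import torch
import torch.nn as nn
import torch.nn.functional as F

class LabelAttentiveClassifier(nn.Module):
    def __init__(self, vocab_size, num_labels, embed_dim=100):
        super().__init__()
        self.word_emb = nn.Embedding(vocab_size, embed_dim)
        # Labels are embedded in the same space as words.
        self.label_emb = nn.Parameter(torch.randn(num_labels, embed_dim))
        self.out = nn.Linear(embed_dim, num_labels)

    def forward(self, token_ids):                      # (batch, seq_len)
        v = self.word_emb(token_ids)                   # (batch, seq_len, dim)
        # Cosine compatibility between every word and every label.
        g = F.normalize(v, dim=-1) @ F.normalize(self.label_emb, dim=-1).T
        # Attention over positions: how strongly any label fires on a word.
        beta = torch.softmax(g.max(dim=-1).values, dim=-1)   # (batch, seq_len)
        z = (beta.unsqueeze(-1) * v).sum(dim=1)        # label-attended doc vector
        return self.out(z)                             # logits over labels

logits = LabelAttentiveClassifier(vocab_size=5000, num_labels=4)(
    torch.randint(0, 5000, (2, 12)))
```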

Topic modelling meets deep neural networks: A survey

H Zhao, D Phung, V Huynh, Y Jin, L Du… - arXiv preprint arXiv …, 2021 - arxiv.org
Topic modelling has been a successful technique for text analysis for almost twenty years.
When topic modelling met deep neural networks, there emerged a new and increasingly …

Adversarially regularized autoencoders

J Zhao, Y Kim, K Zhang, A Rush… - … conference on machine …, 2018 - proceedings.mlr.press
Deep latent variable models, trained using variational autoencoders or generative
adversarial networks, are now a key technique for representation learning of continuous …
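
A toy sketch of the training pattern the title names: an autoencoder whose latent codes are adversarially regularized by a critic against codes produced by a generator from noise. The continuous toy inputs, the network sizes, and the omission of critic constraints (e.g. weight clipping or a gradient penalty) are simplifying assumptions for brevity.

```python
# Hypothetical, simplified ARAE-style alternating update loop.
import torch
import torch.nn as nn

d_in, d_z, d_noise = 32, 8, 8
enc = nn.Sequential(nn.Linear(d_in, d_z))
dec = nn.Sequential(nn.Linear(d_z, d_in))
gen = nn.Sequential(nn.Linear(d_noise, d_z), nn.ReLU(), nn.Linear(d_z, d_z))
critic = nn.Sequential(nn.Linear(d_z, 32), nn.ReLU(), nn.Linear(32, 1))

opt_ae = torch.optim.Adam(list(enc.parameters()) + list(dec.parameters()), lr=1e-3)
opt_gen = torch.optim.Adam(gen.parameters(), lr=1e-3)
opt_cri = torch.optim.Adam(critic.parameters(), lr=1e-3)

for step in range(200):
    x = torch.randn(64, d_in)                 # stand-in for a batch of encoded text

    # 1) Reconstruction step for the autoencoder.
    loss_rec = ((dec(enc(x)) - x) ** 2).mean()
    opt_ae.zero_grad(); loss_rec.backward(); opt_ae.step()

    # 2) Critic step: separate real codes enc(x) from generated codes gen(z).
    z = torch.randn(64, d_noise)
    loss_cri = critic(gen(z).detach()).mean() - critic(enc(x).detach()).mean()
    opt_cri.zero_grad(); loss_cri.backward(); opt_cri.step()

    # 3) Adversarial step: the generator (and, in the paper, the encoder too)
    #    is updated to fool the critic, regularizing the latent space.
    loss_gen = -critic(gen(torch.randn(64, d_noise))).mean()
    opt_gen.zero_grad(); loss_gen.backward(); opt_gen.step()
```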

Semi-amortized variational autoencoders

Y Kim, S Wiseman, A Miller… - … on Machine Learning, 2018 - proceedings.mlr.press
Amortized variational inference (AVI) replaces instance-specific local inference with a global
inference network. While AVI has enabled efficient training of deep generative models such …
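
The snippet contrasts amortized inference with instance-specific local inference; the semi-amortized scheme initializes the variational parameters with the inference network and then refines them with a few per-instance gradient steps on the ELBO. A minimal PyTorch sketch follows, assuming a small Gaussian VAE and detaching the refinement from the encoder (the paper additionally backpropagates through the refinement); sizes and step counts are illustrative.

```python
# Hypothetical sketch of semi-amortized variational inference.
import torch
import torch.nn as nn

enc = nn.Linear(20, 2 * 4)          # amortized inference net: x -> (mu, logvar)
dec = nn.Linear(4, 20)              # decoder: z -> reconstruction

def neg_elbo(x, mu, logvar):
    z = mu + torch.randn_like(mu) * torch.exp(0.5 * logvar)   # reparameterize
    rec = ((dec(z) - x) ** 2).sum(dim=-1)                      # Gaussian recon term
    kl = 0.5 * (mu ** 2 + logvar.exp() - logvar - 1).sum(dim=-1)
    return (rec + kl).mean()

x = torch.randn(8, 20)
mu, logvar = enc(x).chunk(2, dim=-1)           # amortized initialization

# Local refinement: a few SVI gradient steps on the variational parameters only.
mu, logvar = mu.detach().requires_grad_(), logvar.detach().requires_grad_()
local_opt = torch.optim.SGD([mu, logvar], lr=0.1)
for _ in range(5):
    local_opt.zero_grad()
    neg_elbo(x, mu, logvar).backward()
    local_opt.step()

# The refined ELBO would then be used (after zeroing grads) to train the decoder.
loss = neg_elbo(x, mu, logvar)
```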

Natural language generation using deep learning to support MOOC learners

C Li, W Xing - International Journal of Artificial Intelligence in …, 2021 - Springer
Among all the learning resources within MOOCs such as video lectures and homework, the
discussion forum stands out as a valuable platform for students' learning through knowledge …

Wide compression: Tensor ring nets

W Wang, Y Sun, B Eriksson… - Proceedings of the …, 2018 - openaccess.thecvf.com
Deep neural networks have demonstrated state-of-the-art performance in a variety of real-
world applications. In order to obtain performance gains, these networks have grown larger …
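
A short NumPy sketch of the tensor-ring idea behind the compression: a large weight tensor is stored as a cyclic chain of small 3-way cores and reconstructed by contracting the chain and closing the ring with a trace. The shapes and ranks below are illustrative assumptions, not values from the paper.

```python
# Hypothetical tensor-ring storage and reconstruction of a weight tensor.
import numpy as np

dims, rank = (4, 6, 5, 3), 3
# One core per mode: shape (rank_in, dim_k, rank_out), with the ring closed.
cores = [np.random.randn(rank, d, rank) for d in dims]

def tr_reconstruct(cores):
    # Contract the chain, keeping the first and last ring indices open, ...
    out = cores[0]                                   # (r, d1, r)
    for core in cores[1:]:
        out = np.einsum('a...b,bcd->a...cd', out, core)
    # ... then close the ring with a trace over those two indices.
    return np.einsum('a...a->...', out)

full = tr_reconstruct(cores)
params_full = full.size
params_tr = sum(c.size for c in cores)
print(full.shape, params_full, params_tr)            # (4, 6, 5, 3) 360 162
```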

A tutorial on deep latent variable models of natural language

Y Kim, S Wiseman, AM Rush - arXiv preprint arXiv:1812.06834, 2018 - arxiv.org
There has been much recent, exciting work on combining the complementary strengths of
latent variable models and deep learning. Latent variable modeling makes it easy to …

Topic-guided variational autoencoders for text generation

W Wang, Z Gan, H Xu, R Zhang, G Wang… - arXiv preprint arXiv …, 2019 - arxiv.org
We propose a topic-guided variational autoencoder (TGVAE) model for text generation.
Distinct from existing variational autoencoder (VAE) based approaches, which assume a …
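
A minimal sketch of the prior structure the abstract hints at: instead of a single standard Gaussian, the latent code is drawn from a mixture of Gaussians whose components act as topics, with the mixture weights standing in here for the output of a neural topic model. All names, sizes, and the stand-in topic proportions are assumptions.

```python
# Hypothetical topic-guided Gaussian-mixture latent sampling.
import torch
import torch.nn as nn

num_topics, d_z = 10, 32
topic_means = nn.Parameter(torch.randn(num_topics, d_z))
topic_logvars = nn.Parameter(torch.zeros(num_topics, d_z))

def sample_latent(topic_proportions):            # (batch, num_topics), rows sum to 1
    k = torch.multinomial(topic_proportions, 1).squeeze(-1)     # pick a topic
    mu, logvar = topic_means[k], topic_logvars[k]
    return mu + torch.randn_like(mu) * torch.exp(0.5 * logvar)  # component sample

theta = torch.softmax(torch.randn(4, num_topics), dim=-1)  # stand-in topic mixture
z = sample_latent(theta)       # (4, d_z); fed to a sequence decoder in the paper
```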

Label confusion learning to enhance text classification models

B Guo, S Han, X Han, H Huang, T Lu - Proceedings of the AAAI …, 2021 - ojs.aaai.org
Representing the true label as a one-hot vector is the common practice in training text
classification models. However, the one-hot representation may not adequately reflect the …
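
A small sketch of the soft-target idea: compare the instance representation with learned label embeddings, mix the resulting similarity distribution back into the one-hot label, and train the classifier against the mixture with a KL loss. The mixing weight and all sizes below are illustrative assumptions rather than the paper's exact formulation.

```python
# Hypothetical label-confusion style soft targets for text classification.
import torch
import torch.nn.functional as F

def soft_target(doc_repr, label_emb, y_onehot, alpha=4.0):
    # Similarity of the document to each label -> simulated label distribution.
    sim = torch.softmax(doc_repr @ label_emb.T, dim=-1)        # (batch, num_labels)
    mixed = alpha * y_onehot + sim                             # one-hot still dominates
    return mixed / mixed.sum(dim=-1, keepdim=True)

doc_repr = torch.randn(8, 64)                 # any encoder output
label_emb = torch.randn(4, 64)                # learnable label embeddings
y_onehot = F.one_hot(torch.randint(0, 4, (8,)), 4).float()
logits = torch.randn(8, 4)                    # classifier output

target = soft_target(doc_repr, label_emb, y_onehot)
loss = F.kl_div(F.log_softmax(logits, dim=-1), target, reduction='batchmean')
```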

Multimodality information fusion for automated machine translation

L Li, T Tayir, Y Han, X Tao, JD Velásquez - Information Fusion, 2023 - Elsevier
Machine translation is a popular automation approach for translating texts between
different languages. Although traditionally it has a strong focus on natural language, images …