Variational autoencoder for deep learning of images, labels and captions

Y Pu, Z Gan, R Henao, X Yuan, C Li… - Advances in neural …, 2016 - proceedings.neurips.cc
A novel variational autoencoder is developed to model images, as well as associated labels
or captions. The Deep Generative Deconvolutional Network (DGDN) is used as a decoder of …
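As shared background for this and the other VAE entries below, here is a minimal sketch of the standard variational autoencoder objective (the evidence lower bound) that such models optimize; the DGDN of the cited paper is one particular choice of decoder p_θ(x|z) and is not reproduced here:

\[
\mathcal{L}(\theta,\phi;\mathbf{x}) \;=\; \mathbb{E}_{q_\phi(\mathbf{z}\mid\mathbf{x})}\!\left[\log p_\theta(\mathbf{x}\mid\mathbf{z})\right] \;-\; \mathrm{KL}\!\left(q_\phi(\mathbf{z}\mid\mathbf{x})\,\|\,p(\mathbf{z})\right),
\]

where q_φ(z|x) is the encoder (recognition model) and p(z) the prior over latent codes.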

When dictionary learning meets deep learning: Deep dictionary learning and coding network for image recognition with limited data

H Tang, H Liu, W Xiao, N Sebe - IEEE transactions on neural …, 2020 - ieeexplore.ieee.org
We present a new deep dictionary learning and coding network (DDLCN) for image
recognition tasks with limited data. The proposed DDLCN has most of the standard deep …
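For context, the classical dictionary-learning / sparse-coding objective that such networks build on can be sketched as follows (the layer-wise formulation of the cited paper is not reproduced here):

\[
\min_{\mathbf{D},\,\{\boldsymbol{\alpha}_i\}} \;\sum_i \left\|\mathbf{x}_i - \mathbf{D}\boldsymbol{\alpha}_i\right\|_2^2 \;+\; \lambda \sum_i \left\|\boldsymbol{\alpha}_i\right\|_1,
\]

with dictionary D, sparse codes α_i, and sparsity weight λ.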

Deconvolutional paragraph representation learning

Y Zhang, D Shen, G Wang, Z Gan… - Advances in Neural …, 2017 - proceedings.neurips.cc
Learning latent representations from long text sequences is an important first step in many
natural language processing applications. Recurrent Neural Networks (RNNs) have become …
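A minimal sketch of the convolutional-encoder / deconvolutional-decoder idea named in the title, assuming PyTorch; the class name ConvDeconvTextAE, the layer widths, strides, and sequence length are illustrative choices, not the paper's configuration:

    import torch
    import torch.nn as nn

    class ConvDeconvTextAE(nn.Module):
        """Convolutional encoder / deconvolutional decoder over token sequences (illustrative)."""
        def __init__(self, vocab_size=10000, emb_dim=128, latent_dim=256, seq_len=60):
            super().__init__()
            self.embed = nn.Embedding(vocab_size, emb_dim)
            # Encoder: strided 1-D convolutions compress the embedded sequence into a short code.
            self.encoder = nn.Sequential(
                nn.Conv1d(emb_dim, 256, kernel_size=5, stride=2, padding=2), nn.ReLU(),
                nn.Conv1d(256, latent_dim, kernel_size=5, stride=2, padding=2), nn.ReLU(),
            )
            # Decoder: transposed convolutions expand the code back to the full sequence length.
            self.decoder = nn.Sequential(
                nn.ConvTranspose1d(latent_dim, 256, kernel_size=5, stride=2,
                                   padding=2, output_padding=1), nn.ReLU(),
                nn.ConvTranspose1d(256, emb_dim, kernel_size=5, stride=2,
                                   padding=2, output_padding=1),
            )
            self.out = nn.Linear(emb_dim, vocab_size)    # per-position vocabulary logits

        def forward(self, tokens):                       # tokens: (batch, seq_len) integer ids
            h = self.embed(tokens).transpose(1, 2)       # (batch, emb_dim, seq_len)
            z = self.encoder(h)                          # (batch, latent_dim, seq_len // 4)
            r = self.decoder(z).transpose(1, 2)          # (batch, seq_len, emb_dim)
            return self.out(r)

The compressed code z plays the role of the paragraph representation; reconstruction is scored with a per-position cross-entropy over the vocabulary.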

Adversarial symmetric variational autoencoder

Y Pu, W Wang, R Henao, L Chen… - Advances in neural …, 2017 - proceedings.neurips.cc
A new form of variational autoencoder (VAE) is developed, in which the joint distribution of
data and codes is considered in two (symmetric) forms: (i) from observed data fed through …
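Read literally, the two (symmetric) forms of the joint distribution referred to in the snippet can be sketched as

\[
p_\theta(\mathbf{x},\mathbf{z}) \;=\; p(\mathbf{z})\,p_\theta(\mathbf{x}\mid\mathbf{z}),
\qquad
q_\phi(\mathbf{x},\mathbf{z}) \;=\; q(\mathbf{x})\,q_\phi(\mathbf{z}\mid\mathbf{x}),
\]

with q(x) the empirical data distribution and p(z) the latent prior; the adversarial construction in the cited paper is aimed at matching these two joints.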

Symmetric variational autoencoder and connections to adversarial learning

L Chen, S Dai, Y Pu, E Zhou, C Li… - International …, 2018 - proceedings.mlr.press
A new form of the variational autoencoder (VAE) is proposed, based on the symmetric
Kullback-Leibler divergence. It is demonstrated that learning of the resulting symmetric VAE …
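For reference, the symmetric Kullback-Leibler divergence between the two joint distributions written out above is

\[
\mathrm{KL}_{\mathrm{sym}}\!\left(q_\phi(\mathbf{x},\mathbf{z})\,\|\,p_\theta(\mathbf{x},\mathbf{z})\right)
\;=\;
\mathrm{KL}\!\left(q_\phi(\mathbf{x},\mathbf{z})\,\|\,p_\theta(\mathbf{x},\mathbf{z})\right)
\;+\;
\mathrm{KL}\!\left(p_\theta(\mathbf{x},\mathbf{z})\,\|\,q_\phi(\mathbf{x},\mathbf{z})\right).
\]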

VAE learning via Stein variational gradient descent

Y Pu, Z Gan, R Henao, C Li, S Han… - Advances in Neural …, 2017 - proceedings.neurips.cc
A new method for learning variational autoencoders (VAEs) is developed, based on Stein
variational gradient descent. A key advantage of this approach is that one need not make …
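A minimal numerical sketch of the Stein variational gradient descent update that this approach builds on, in Python with NumPy; the function names rbf_kernel and svgd_step and the fixed bandwidth h are illustrative choices, not the paper's implementation:

    import numpy as np

    def rbf_kernel(x, h):
        """RBF kernel matrix k(x_j, x_i) and its gradient with respect to x_j."""
        diffs = x[:, None, :] - x[None, :, :]          # (n, n, d): x_j - x_i
        sq = np.sum(diffs ** 2, axis=-1)               # (n, n) squared distances
        k = np.exp(-sq / h)                            # kernel matrix
        grad_k = -2.0 / h * diffs * k[:, :, None]      # d k(x_j, x_i) / d x_j
        return k, grad_k

    def svgd_step(particles, grad_log_p, step=1e-2, h=1.0):
        """One SVGD update: move particles along the kernelized Stein direction."""
        n = particles.shape[0]
        k, grad_k = rbf_kernel(particles, h)
        scores = grad_log_p(particles)                 # (n, d): grad of log target at each particle
        # phi(x_i) = (1/n) sum_j [ k(x_j, x_i) * score(x_j) + grad_{x_j} k(x_j, x_i) ]
        phi = (k.T @ scores + grad_k.sum(axis=0)) / n
        return particles + step * phi

For a standard Gaussian target, for example, grad_log_p = lambda x: -x; in VAE learning the particles would instead be samples of the latent code driven by gradients of the joint log-likelihood.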

High-order stochastic gradient thermostats for Bayesian learning of deep models

C Li, C Chen, K Fan, L Carin - Proceedings of the AAAI Conference on …, 2016 - ojs.aaai.org
Learning in deep models using Bayesian methods has generated significant attention
recently. This is largely because of the feasibility of modern Bayesian methods to yield …
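As background, the first-order stochastic gradient Nosé-Hoover thermostat that higher-order methods of this kind generalize can be sketched as below (step size h, diffusion constant A, thermostat variable ξ, stochastic gradient ∇Ũ, parameter dimension n); this is a sketch of the baseline sampler, not the cited paper's higher-order integrator:

\[
\begin{aligned}
\mathbf{p}_{t+1} &= \mathbf{p}_t - \xi_t\,\mathbf{p}_t\,h - \nabla\tilde U(\boldsymbol{\theta}_t)\,h + \sqrt{2Ah}\;\boldsymbol{\epsilon}_t, \qquad \boldsymbol{\epsilon}_t \sim \mathcal{N}(\mathbf{0}, \mathbf{I}),\\
\boldsymbol{\theta}_{t+1} &= \boldsymbol{\theta}_t + \mathbf{p}_{t+1}\,h,\\
\xi_{t+1} &= \xi_t + \Big(\tfrac{1}{n}\,\mathbf{p}_{t+1}^{\top}\mathbf{p}_{t+1} - 1\Big)\,h.
\end{aligned}
\]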

Tensor-dictionary learning with deep Kruskal-factor analysis

A Stevens, Y Pu, Y Sun, G Spell… - Artificial Intelligence …, 2017 - proceedings.mlr.press
A multi-way factor analysis model is introduced for tensor-variate data of any order. Each
data item is represented as a (sparse) sum of Kruskal decompositions, a Kruskal-factor …
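For reference, a rank-R Kruskal (CP) decomposition of an order-K tensor, the building block referred to in the snippet, is

\[
\mathcal{X} \;\approx\; \sum_{r=1}^{R} \lambda_r\; \mathbf{u}_r^{(1)} \circ \mathbf{u}_r^{(2)} \circ \cdots \circ \mathbf{u}_r^{(K)},
\]

where ∘ denotes the vector outer product and the weights λ_r may be sparse.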

Deep micro-dictionary learning and coding network

H Tang, H Wei, W Xiao, W Wang, D Xu… - 2019 IEEE Winter …, 2019 - ieeexplore.ieee.org
In this paper, we propose a novel Deep Micro-Dictionary Learning and Coding Network
(DDLCN). DDLCN has most of the standard deep learning layers (pooling, fully connected …

Towards better representations with deep/Bayesian learning

C Li - 2018 - search.proquest.com
Deep learning and Bayesian learning are two popular research topics in machine
learning. They provide flexible representations in a complementary manner. Therefore …