Variational autoencoder for deep learning of images, labels and captions
A novel variational autoencoder is developed to model images, as well as associated labels
or captions. The Deep Generative Deconvolutional Network (DGDN) is used as a decoder of …
When dictionary learning meets deep learning: Deep dictionary learning and coding network for image recognition with limited data
We present a new deep dictionary learning and coding network (DDLCN) for image-
recognition tasks with limited data. The proposed DDLCN has most of the standard deep …
Deconvolutional paragraph representation learning
Learning latent representations from long text sequences is an important first step in many
natural language processing applications. Recurrent Neural Networks (RNNs) have become …
Adversarial symmetric variational autoencoder
A new form of variational autoencoder (VAE) is developed, in which the joint distribution of
data and codes is considered in two (symmetric) forms: (i) from observed data fed through …
Symmetric variational autoencoder and connections to adversarial learning
A new form of the variational autoencoder (VAE) is proposed, based on the symmetric
Kullback-Leibler divergence. It is demonstrated that learning of the resulting symmetric VAE …
VAE learning via Stein variational gradient descent
A new method for learning variational autoencoders (VAEs) is developed, based on Stein
variational gradient descent. A key advantage of this approach is that one need not make …
High-order stochastic gradient thermostats for Bayesian learning of deep models
Learning in deep models using Bayesian methods has generated significant attention
recently. This is largely because of the feasibility of modern Bayesian methods to yield …
Tensor-dictionary learning with deep Kruskal-factor analysis
A multi-way factor analysis model is introduced for tensor-variate data of any order. Each
data item is represented as a (sparse) sum of Kruskal decompositions, a Kruskal-factor …
Deep micro-dictionary learning and coding network
In this paper, we propose a novel Deep Micro-Dictionary Learning and Coding Network
(DDLCN). DDLCN has most of the standard deep learning layers (pooling, fully connected …
Towards better representations with deep/Bayesian learning
C Li - 2018 - search.proquest.com
Abstract Deep learning and Bayesian learning are two popular research topics in machine
learning. They provide flexible representations in a complementary manner. Therefore …