Representation learning: A review and new perspectives

Y Bengio, A Courville, P Vincent - IEEE transactions on pattern …, 2013 - ieeexplore.ieee.org
The success of machine learning algorithms generally depends on data representation, and
we hypothesize that this is because different representations can entangle and hide more or …

[CITATION][C] Unsupervised feature learning and deep learning: A review and new perspectives

Y Bengio, AC Courville, P Vincent - CoRR, abs/1206.5538, 2012 - xmanong.com
The success of machine learning algorithms generally depends on data representation, and
we hypothesize that this is because different representations can entangle and hide more or …

BERT has a mouth, and it must speak: BERT as a Markov random field language model

A Wang, K Cho - arXiv preprint arXiv:1902.04094, 2019 - arxiv.org
We show that BERT (Devlin et al., 2018) is a Markov random field language model. This
formulation gives way to a natural procedure to sample sentences from BERT. We generate …
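The procedure this abstract alludes to treats BERT's masked-token conditionals as the conditionals of a Markov random field and draws sentences by Gibbs sampling, resampling one position at a time. A minimal sketch of that sampling loop, with a uniform stand-in distribution in place of a real masked language model (the `conditional` function and toy vocabulary here are hypothetical):

```python
import random

# Toy vocabulary and a stand-in for a masked LM's conditional distribution:
# given a sentence with position i treated as masked, return a distribution
# over tokens for that slot. (Hypothetical scorer; a real implementation
# would query BERT's masked-token softmax.)
VOCAB = ["the", "cat", "dog", "sat", "ran", "here"]

def conditional(sentence, i):
    # Uniform stand-in distribution over the vocabulary.
    return {w: 1.0 / len(VOCAB) for w in VOCAB}

def gibbs_sample(length=4, sweeps=10, seed=0):
    rng = random.Random(seed)
    sent = [rng.choice(VOCAB) for _ in range(length)]
    for _ in range(sweeps):
        for i in range(length):  # resample each position from its conditional
            dist = conditional(sent, i)
            words, probs = zip(*dist.items())
            sent[i] = rng.choices(words, weights=probs)[0]
    return sent

print(" ".join(gibbs_sample()))
```

With a real masked LM, the only change is replacing `conditional` with a call that masks position `i` and returns the model's predicted token distribution.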

[BOOK][B] Deep learning

I Goodfellow, Y Bengio, A Courville - 2016 - synapse.koreamed.org
Kwang Gi Kim https://doi.org/10.4258/hir.2016.22.4.351 ing those who are beginning their
careers in deep learning and artificial intelligence research. The other target audience …

Neural autoregressive distribution estimation

B Uria, MA Côté, K Gregor, I Murray… - Journal of Machine …, 2016 - jmlr.org
We present Neural Autoregressive Distribution Estimation (NADE) models, which are neural
network architectures applied to the problem of unsupervised distribution and density …
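NADE's core idea is the autoregressive factorization p(x) = ∏_d p(x_d | x_{<d}), with each conditional computed by a neural network, which yields an exactly normalized density. A toy sketch with Bernoulli conditionals whose logits are linear in the preceding dimensions (the random weights are illustrative only, not NADE's tied parameterization):

```python
import math
import random

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def log_prob(x, weights, biases):
    # Autoregressive factorization: each binary dimension x[d] is
    # Bernoulli with a logit that depends only on x[0..d-1].
    total = 0.0
    for d in range(len(x)):
        logit = biases[d] + sum(weights[d][j] * x[j] for j in range(d))
        p = sigmoid(logit)
        total += math.log(p if x[d] == 1 else 1.0 - p)
    return total

rng = random.Random(0)
D = 4
# Hypothetical random parameters standing in for a trained model.
W = [[rng.gauss(0, 0.1) for _ in range(D)] for _ in range(D)]
b = [0.0] * D
print(log_prob([1, 0, 1, 1], W, b))
```

Because each conditional is a proper Bernoulli distribution, the probabilities of all 2^D binary vectors sum to exactly 1, with no partition function to estimate.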

Deep learning of representations: Looking forward

Y Bengio - International conference on statistical language and …, 2013 - Springer
Deep learning research aims at discovering learning algorithms that discover multiple levels
of distributed representations, with higher levels representing more abstract concepts …

An introduction to restricted Boltzmann machines

A Fischer, C Igel - Iberoamerican congress on pattern recognition, 2012 - Springer
Abstract Restricted Boltzmann machines (RBMs) are probabilistic graphical models that can
be interpreted as stochastic neural networks. The increase in computational power and the …

Training restricted Boltzmann machines: An introduction

A Fischer, C Igel - Pattern Recognition, 2014 - Elsevier
Abstract Restricted Boltzmann machines (RBMs) are probabilistic graphical models that can
be interpreted as stochastic neural networks. They have attracted much attention as building …

An overview on restricted Boltzmann machines

N Zhang, S Ding, J Zhang, Y Xue - Neurocomputing, 2018 - Elsevier
Abstract The Restricted Boltzmann Machine (RBM) has aroused wide interest in machine
learning fields during the past decade. This review aims to report the recent developments in …
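The RBM entries above describe RBMs as probabilistic graphical models interpretable as stochastic neural networks: conditioned on one layer, the units of the other layer are independent, which makes block Gibbs sampling straightforward. A minimal sketch with hypothetical random weights (sampling only; training procedures such as contrastive divergence are not shown):

```python
import math
import random

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def sample_layer(inputs, weights, biases, rng, transpose=False):
    # Sample every unit of one layer given the other layer's state.
    # transpose=False: hidden given visible (uses W[i][j]);
    # transpose=True: visible given hidden (uses W[j][i]).
    out = []
    for j in range(len(biases)):
        act = biases[j] + sum(
            (weights[j][i] if transpose else weights[i][j]) * inputs[i]
            for i in range(len(inputs))
        )
        out.append(1 if rng.random() < sigmoid(act) else 0)
    return out

rng = random.Random(0)
n_v, n_h = 6, 3
# Hypothetical random parameters standing in for a trained RBM.
W = [[rng.gauss(0, 0.1) for _ in range(n_h)] for _ in range(n_v)]
b_v, b_h = [0.0] * n_v, [0.0] * n_h

v = [rng.randint(0, 1) for _ in range(n_v)]
for _ in range(5):  # a few block-Gibbs sweeps
    h = sample_layer(v, W, b_h, rng)                   # sample p(h | v)
    v = sample_layer(h, W, b_v, rng, transpose=True)   # sample p(v | h)
print(v)
```

The conditional independence within each layer is what the "restricted" in RBM buys: both sampling steps above are a single vectorizable pass rather than a unit-by-unit chain.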

Better mixing via deep representations

Y Bengio, G Mesnil, Y Dauphin… - … conference on machine …, 2013 - proceedings.mlr.press
It has been hypothesized, and supported with experimental evidence, that deeper
representations, when well trained, tend to do a better job at disentangling the underlying …