Energy-based out-of-distribution detection

W Liu, X Wang, J Owens, Y Li - Advances in neural …, 2020 - proceedings.neurips.cc
Determining whether inputs are out-of-distribution (OOD) is an essential building block for
safely deploying machine learning models in the open world. However, previous methods …
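The energy score at the heart of this approach is cheap to compute from a classifier's logits; a minimal sketch in plain Python (the example logits below are invented for illustration):

```python
import math

def energy_score(logits, temperature=1.0):
    # Free-energy score from Liu et al. (2020): E(x) = -T * logsumexp(f(x) / T).
    # Lower energy suggests an in-distribution input; thresholding this score
    # separates in-distribution inputs from OOD ones.
    m = max(l / temperature for l in logits)
    lse = m + math.log(sum(math.exp(l / temperature - m) for l in logits))
    return -temperature * lse

confident = energy_score([10.0, 0.1, 0.2])  # peaked logits -> low energy
uncertain = energy_score([1.0, 1.1, 0.9])   # flat logits -> higher energy
```

In practice the decision threshold is chosen on held-out in-distribution data, e.g. so that a fixed fraction of it is retained.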

Denoising diffusion probabilistic models

J Ho, A Jain, P Abbeel - Advances in neural information …, 2020 - proceedings.neurips.cc
We present high quality image synthesis results using diffusion probabilistic models, a class
of latent variable models inspired by considerations from nonequilibrium thermodynamics …
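The closed-form forward (noising) process that makes DDPM training tractable is easy to sketch; the linear beta schedule below follows a common 1e-4 to 0.02 choice, though the exact values are an assumption here:

```python
import math

def q_sample(x0, alpha_bar_t, eps):
    # DDPM forward process in closed form:
    # x_t = sqrt(alpha_bar_t) * x0 + sqrt(1 - alpha_bar_t) * eps, eps ~ N(0, I).
    return math.sqrt(alpha_bar_t) * x0 + math.sqrt(1.0 - alpha_bar_t) * eps

# Linear beta schedule over T = 1000 steps (a common choice, assumed here).
T = 1000
betas = [1e-4 + (0.02 - 1e-4) * t / (T - 1) for t in range(T)]
alpha_bars, prod = [], 1.0
for b in betas:
    prod *= 1.0 - b
    alpha_bars.append(prod)
# Early steps barely perturb x0; by t = T the signal is essentially gone.
```

Training then amounts to sampling a random t, forming x_t with q_sample, and regressing a network's noise prediction onto eps.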

MoCoGAN: Decomposing motion and content for video generation

S Tulyakov, MY Liu, X Yang… - Proceedings of the IEEE …, 2018 - openaccess.thecvf.com
Visual signals in a video can be divided into content and motion. While content specifies
which objects are in the video, motion describes their dynamics. Based on this prior, we …

Learning generative vision transformer with energy-based latent space for saliency prediction

J Zhang, J **e, N Barnes, P Li - Advances in Neural …, 2021 - proceedings.neurips.cc
Vision transformer networks have shown superiority in many computer vision tasks. In this
paper, we take a step further by proposing a novel generative vision transformer with latent …

BEEF: Bi-compatible class-incremental learning via energy-based expansion and fusion

FY Wang, DW Zhou, L Liu, HJ Ye, Y Bian… - The eleventh …, 2022 - drive.google.com
Neural networks suffer from catastrophic forgetting when sequentially learning tasks phase-
by-phase, making them inapplicable in dynamically updated systems. Class-incremental …

Improved contrastive divergence training of energy based models

Y Du, S Li, J Tenenbaum, I Mordatch - arXiv preprint arXiv:2012.01316, 2020 - arxiv.org
Contrastive divergence is a popular method of training energy-based models, but is known
to have difficulties with training stability. We propose an adaptation to improve contrastive …
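The basic contrastive-divergence update is compact enough to demonstrate on a toy one-parameter energy function; everything below (the Gaussian data, step sizes, chain length) is an illustrative assumption, not the authors' setup:

```python
import math
import random

random.seed(0)
data = [random.gauss(2.0, 1.0) for _ in range(2000)]  # "real" samples

mu = 0.0  # parameter of the energy E(x) = (x - mu)^2 / 2
lr = 0.1

for _ in range(200):
    pos = random.sample(data, 100)
    neg = []
    for x in pos:
        # Short Langevin chain initialized at the data point (CD-k style):
        # x <- x - step * dE/dx + sqrt(2 * step) * noise
        for _ in range(10):
            x += -0.05 * (x - mu) + math.sqrt(0.1) * random.gauss(0.0, 1.0)
        neg.append(x)
    # CD gradient ascent on log-likelihood: push energy down on data,
    # up on model samples. For this E, d(log p)/d(mu) ~ mean(pos) - mean(neg).
    mu += lr * (sum(pos) / len(pos) - sum(neg) / len(neg))

# mu should end up near the data mean of 2.0.
```

The short, non-convergent negative chains are exactly where the stability issues this paper targets come from.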

Learning latent space energy-based prior model

B Pang, T Han, E Nijkamp, SC Zhu… - Advances in Neural …, 2020 - proceedings.neurips.cc
We propose an energy-based model (EBM) in the latent space of a generator model, so that
the EBM serves as a prior model that stands on the top-down network of the generator …

On the anatomy of MCMC-based maximum likelihood learning of energy-based models

E Nijkamp, M Hill, T Han, SC Zhu, YN Wu - Proceedings of the AAAI …, 2020 - ojs.aaai.org
This study investigates the effects of Markov chain Monte Carlo (MCMC) sampling in
unsupervised Maximum Likelihood (ML) learning. Our attention is restricted to the family of …
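The unadjusted Langevin dynamics analyzed in this line of work can be sketched in a few lines; the quadratic energy and step size below are illustrative choices, not the paper's experiments:

```python
import math
import random

random.seed(1)

def langevin_chain(grad_E, x0, step=0.01, n_steps=100_000):
    # Unadjusted Langevin dynamics:
    # x <- x - step * dE/dx + sqrt(2 * step) * noise
    x, xs = x0, []
    for _ in range(n_steps):
        x += -step * grad_E(x) + math.sqrt(2.0 * step) * random.gauss(0.0, 1.0)
        xs.append(x)
    return xs

# Target E(x) = x^2 / 2, so p(x) ~ exp(-E(x)) is a standard normal.
xs = langevin_chain(lambda x: x, x0=0.0)
mean = sum(xs) / len(xs)
var = sum((x - mean) ** 2 for x in xs) / len(xs)
# With a long, well-mixed chain, mean ~ 0 and var ~ 1.
```

With short-run, non-convergent chains — the regime this paper dissects — the samples instead reflect the initialization as much as the model, which is exactly the effect under study.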

Generalized energy based models

M Arbel, L Zhou, A Gretton - arXiv preprint arXiv:2003.05033, 2020 - arxiv.org
We introduce the Generalized Energy Based Model (GEBM) for generative modelling. These
models combine two trained components: a base distribution (generally an implicit model) …

Residual energy-based models for text generation

Y Deng, A Bakhtin, M Ott, A Szlam… - arXiv preprint arXiv …, 2020 - arxiv.org
Text generation is ubiquitous in many NLP tasks, from summarization, to dialogue and
machine translation. The dominant parametric approach is based on locally normalized …