A review of change of variable formulas for generative modeling
U. Köthe - arXiv preprint arXiv:2308.02652, 2023 - arxiv.org
Change-of-variables (CoV) formulas make it possible to reduce complicated probability densities to
simpler ones via a learned transformation with a tractable Jacobian determinant. They are thus …
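For reference, the change-of-variables identity that the models below build on, with a learned transformation f mapping data x to a simple base density p_Z:

\log p_X(x) = \log p_Z\big(f(x)\big) + \log\left|\det \frac{\partial f(x)}{\partial x}\right|

The cost of evaluating the Jacobian-determinant term is what drives most of the architectural choices in the works listed here.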
Free-form flows: Make any architecture a normalizing flow
Normalizing Flows are generative models that directly maximize the likelihood. Previously,
the design of normalizing flows was largely constrained by the need for analytical …
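The "analytical" constraint alluded to here is typically an architecture such as an affine coupling layer, whose inverse and log-Jacobian-determinant are available in closed form. A minimal PyTorch sketch for orientation only; the class name, conditioner network, and sizes are illustrative and not taken from the paper:

import torch
import torch.nn as nn

class AffineCoupling(nn.Module):
    # Transforms the second half of the input conditioned on the first half.
    # The Jacobian is triangular, so log|det J| is simply the sum of the log scales,
    # and the inverse is available in closed form -- the "analytical" constraint.
    def __init__(self, dim, hidden=128):
        super().__init__()
        assert dim % 2 == 0, "illustrative sketch assumes an even input dimension"
        self.net = nn.Sequential(
            nn.Linear(dim // 2, hidden), nn.ReLU(),
            nn.Linear(hidden, dim),   # predicts log-scale and shift for the second half
        )

    def forward(self, x):
        x1, x2 = x.chunk(2, dim=-1)
        log_s, t = self.net(x1).chunk(2, dim=-1)
        y2 = x2 * torch.exp(log_s) + t
        return torch.cat([x1, y2], dim=-1), log_s.sum(dim=-1)  # closed-form log|det J|

    def inverse(self, y):
        y1, y2 = y.chunk(2, dim=-1)
        log_s, t = self.net(y1).chunk(2, dim=-1)
        return torch.cat([y1, (y2 - t) * torch.exp(-log_s)], dim=-1)  # closed-form inverse

Free-form flows remove exactly this restriction, allowing unconstrained networks in place of such hand-designed invertible blocks.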
A Pseudoreversible Normalizing Flow for Stochastic Dynamical Systems with Various Initial Distributions
We present a pseudoreversible normalizing flow method for efficiently generating samples
of the state of a stochastic differential equation (SDE) with various initial distributions. The …
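Here "state of an SDE" refers to the solution of a generic Itô equation; in standard notation (drift f, diffusion σ, Brownian motion W, not the paper's specific system):

dX_t = f(X_t, t)\,dt + \sigma(X_t, t)\,dW_t, \qquad X_0 \sim p_0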
Lifting architectural constraints of injective flows
Normalizing Flows explicitly maximize a full-dimensional likelihood on the training data.
However, real data is typically supported only on a lower-dimensional manifold, leading the …
Training energy-based normalizing flow with score-matching objectives
In this paper, we establish a connection between the parameterization of flow-based and
energy-based generative models, and present a new flow-based modeling approach called …
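For context, the classical (Hyvärinen) score-matching objective that such approaches train with instead of maximum likelihood, written for a model density q_θ:

\mathcal{J}(\theta) = \mathbb{E}_{p(x)}\!\left[\tfrac{1}{2}\,\big\|\nabla_x \log q_\theta(x)\big\|^2 + \operatorname{tr}\!\big(\nabla_x^2 \log q_\theta(x)\big)\right]

Because the objective depends on the model only through the score \nabla_x \log q_\theta(x), it does not require the normalizing constant of q_θ.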
Semi-autoregressive energy flows: exploring likelihood-free training of normalizing flows
Training normalizing flow generative models can be challenging due to the need to
calculate computationally expensive determinants of Jacobians. This paper studies the …
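The expense referred to here is that, for an unconstrained map, the exact log-determinant requires materializing a full D×D Jacobian. A brute-force PyTorch illustration of that cost (the network and dimension are hypothetical, and this is not the approach proposed in the paper):

import torch

def exact_log_abs_det(f, x):
    # Brute-force log|det J_f(x)| for a single sample x of dimension D:
    # materializes the full D x D Jacobian (O(D^2) memory) and takes its
    # determinant (O(D^3) time), which is what scalable flows try to avoid.
    J = torch.autograd.functional.jacobian(f, x)
    return torch.linalg.slogdet(J).logabsdet

D = 64
f = torch.nn.Sequential(torch.nn.Linear(D, D), torch.nn.Tanh(), torch.nn.Linear(D, D))
x = torch.randn(D)
print(exact_log_abs_det(f, x))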
Maximum Likelihood Training of Autoencoders
Maximum likelihood training has favorable statistical properties and is popular for generative
modeling, especially with normalizing flows. On the other hand, generative autoencoders …
Deep generative model based rate-distortion for image downscaling assessment
In this paper, we propose Image Downscaling Assessment by Rate-Distortion (IDA-RD), a
novel measure to quantitatively evaluate image downscaling algorithms. In contrast to image …
Exact, Tractable Gauss-Newton Optimization in Deep Reversible Architectures Reveal Poor Generalization
Second-order optimization has been shown to accelerate the training of deep neural
networks in many applications, often yielding faster progress per iteration on the training …
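For orientation, the classical Gauss-Newton update on a least-squares loss with residual vector r(θ) and Jacobian J = ∂r/∂θ; the paper's contribution is an exact, tractable realization of such second-order steps in reversible networks, which this generic formula does not capture:

\theta \leftarrow \theta - \big(J^\top J\big)^{-1} J^\top r(\theta)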
Maximum Entropy Reinforcement Learning via Energy-Based Normalizing Flow
Existing Maximum-Entropy (MaxEnt) Reinforcement Learning (RL) methods for continuous
action spaces are typically formulated based on actor-critic frameworks and optimized …
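The MaxEnt RL objective referred to here augments the expected return with a policy-entropy bonus weighted by a temperature α:

J(\pi) = \mathbb{E}_{\pi}\!\left[\sum_{t} r(s_t, a_t) + \alpha\,\mathcal{H}\big(\pi(\cdot \mid s_t)\big)\right]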