Optimizing diffusion noise can serve as universal motion priors
We propose Diffusion Noise Optimization (DNO), a new method that effectively
leverages existing motion diffusion models as motion priors for a wide range of motion …
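The DNO snippet above describes steering a pretrained motion diffusion model by optimizing its starting noise rather than its weights. Below is a minimal, generic sketch of that pattern, assuming an epsilon-prediction denoiser model(x_t, t), a precomputed alphas_cumprod schedule, and a differentiable task loss criterion; all names are illustrative placeholders, not the paper's API. Backpropagating through the full chain is memory-hungry in practice; shorter DDIM chains or gradient checkpointing are common workarounds.

# Sketch: optimize the initial noise of a frozen diffusion model against a task loss.
import torch

def ddim_sample(model, x_T, alphas_cumprod, num_steps=50):
    """Deterministic DDIM sampling kept differentiable w.r.t. the initial noise x_T."""
    x = x_T
    ts = torch.linspace(len(alphas_cumprod) - 1, 0, num_steps).long()
    for i in range(len(ts) - 1):
        t, t_prev = ts[i], ts[i + 1]
        a_t, a_prev = alphas_cumprod[t], alphas_cumprod[t_prev]
        eps = model(x, t)                                   # predicted noise
        x0_hat = (x - (1 - a_t).sqrt() * eps) / a_t.sqrt()  # predicted clean sample
        x = a_prev.sqrt() * x0_hat + (1 - a_prev).sqrt() * eps
    return x

def optimize_noise(model, criterion, x_T, alphas_cumprod, steps=100, lr=0.05):
    """Gradient-descend on the starting noise so the decoded sample lowers the task loss."""
    z = x_T.clone().requires_grad_(True)
    opt = torch.optim.Adam([z], lr=lr)
    for _ in range(steps):
        x0 = ddim_sample(model, z, alphas_cumprod)
        loss = criterion(x0)          # e.g. an editing or in-betweening objective
        opt.zero_grad()
        loss.backward()               # backprop through the whole denoising chain
        opt.step()
    with torch.no_grad():
        return ddim_sample(model, z, alphas_cumprod)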
Gradient guidance for diffusion models: An optimization perspective
Diffusion models have demonstrated empirical successes in various applications and can
be adapted to task-specific needs via guidance. This paper studies a form of gradient …
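As a reference point for the gradient-guidance setting sketched above, the following is a generic classifier-guidance-style update: it perturbs the predicted noise with the gradient of a differentiable reward f evaluated on the Tweedie estimate of the clean sample. This is a minimal sketch under those assumptions, not necessarily the exact rule analyzed in this paper.

# Sketch: steer one reverse-diffusion step with the gradient of a reward f.
import torch

@torch.enable_grad()
def guided_eps(model, x_t, t, alphas_cumprod, f, scale=1.0):
    a_t = alphas_cumprod[t]
    x_t = x_t.detach().requires_grad_(True)
    eps = model(x_t, t)
    x0_hat = (x_t - (1 - a_t).sqrt() * eps) / a_t.sqrt()    # Tweedie-style x0 estimate
    grad = torch.autograd.grad(f(x0_hat).sum(), x_t)[0]     # d f / d x_t through the model
    # Shift the noise prediction along the reward gradient (classifier-guidance form).
    return eps - scale * (1 - a_t).sqrt() * grad
    # Plug the returned eps into the usual DDPM/DDIM update in place of model(x_t, t).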
Diffusion posterior sampling for linear inverse problem solving: A filtering perspective
Diffusion models have achieved tremendous success in generating high-dimensional data
like images, videos and audio. These models provide powerful data priors that can solve …
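For the linear inverse problem setting above (y = A x + noise), a minimal sketch of the widely used DPS-style correction step is shown below. It assumes an epsilon-prediction model and a callable forward operator A, and it is not the filtering-based sampler this paper proposes.

# Sketch: one posterior-sampling step combining a DDIM move with a data-consistency gradient.
import torch

def dps_step(model, x_t, t, t_prev, alphas_cumprod, A, y, zeta=1.0):
    a_t, a_prev = alphas_cumprod[t], alphas_cumprod[t_prev]
    x_t = x_t.detach().requires_grad_(True)
    eps = model(x_t, t)
    x0_hat = (x_t - (1 - a_t).sqrt() * eps) / a_t.sqrt()       # posterior mean estimate
    residual = torch.linalg.vector_norm(y - A(x0_hat))         # data-fidelity term
    grad = torch.autograd.grad(residual, x_t)[0]
    x_prev = a_prev.sqrt() * x0_hat + (1 - a_prev).sqrt() * eps  # unconditional DDIM move
    return (x_prev - zeta * grad).detach()                     # measurement-consistency correction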
Provably robust score-based diffusion posterior sampling for plug-and-play image reconstruction
In a great number of tasks in science and engineering, the goal is to infer an unknown image
from a small number of noisy measurements collected from a known forward model …
Learning diffusion priors from observations by expectation maximization
Diffusion models recently proved to be remarkable priors for Bayesian inverse problems.
However, training these models typically requires access to large amounts of clean data …
TFG: Unified training-free guidance for diffusion models
Given an unconditional diffusion model and a predictor for a target property of interest (e.g., a
classifier), the goal of training-free guidance is to generate samples with desirable target …
Iterated denoising energy matching for sampling from Boltzmann densities
Efficiently generating statistically independent samples from an unnormalized probability
distribution, such as equilibrium samples of many-body systems, is a foundational problem …
Amortizing intractable inference in diffusion models for vision, language, and control
Diffusion models have emerged as effective distribution estimators in vision, language, and
reinforcement learning, but their use as priors in downstream tasks poses an intractable …
EraseDiff: Erasing data influence in diffusion models
We introduce EraseDiff, an unlearning algorithm designed for diffusion models to address
concerns related to data memorization. Our approach formulates the unlearning task as a …
Guidance with spherical Gaussian constraint for conditional diffusion
Recent advances in diffusion models attempt to handle conditional generative tasks by
utilizing a differentiable loss function for guidance without the need for additional training …
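The constraint referenced in the title can be pictured as follows: in dimension d, a draw from N(mu, sigma^2 I) lands close to the sphere of radius sigma*sqrt(d) around mu, so a guided reverse step can be renormalized back onto that high-probability shell instead of drifting off it. The sketch below illustrates only that projection idea; the particular blending of noise and guidance directions is my own assumption, not taken from the paper.

# Sketch: project a guidance-perturbed Gaussian step back onto the typical shell.
import torch

def shell_constrained_step(mu, sigma, loss_grad, guidance_rate=0.2):
    d = mu.numel()
    radius = sigma * d ** 0.5                       # typical norm of a Gaussian step
    noise_dir = torch.randn_like(mu)
    guide_dir = -loss_grad                          # move against the guidance loss
    # Blend the stochastic direction with the (rescaled) guidance direction,
    # then renormalize so the sample still lies on the shell of radius sigma*sqrt(d).
    direction = (1 - guidance_rate) * noise_dir + guidance_rate * guide_dir / (
        guide_dir.norm() + 1e-8) * noise_dir.norm()
    return mu + radius * direction / (direction.norm() + 1e-8)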