Optimizing diffusion noise can serve as universal motion priors

K Karunratanakul, K Preechakul… - Proceedings of the …, 2024 - openaccess.thecvf.com
We propose Diffusion Noise Optimization (DNO), a new method that effectively
leverages existing motion diffusion models as motion priors for a wide range of motion …

Gradient guidance for diffusion models: An optimization perspective

Y Guo, H Yuan, Y Yang, M Chen… - Advances in Neural …, 2025 - proceedings.neurips.cc
Diffusion models have demonstrated empirical successes in various applications and can
be adapted to task-specific needs via guidance. This paper studies a form of gradient …

Diffusion posterior sampling for linear inverse problem solving: A filtering perspective

Z Dou, Y Song - The Twelfth International Conference on Learning …, 2024 - openreview.net
Diffusion models have achieved tremendous success in generating high-dimensional data
like images, videos and audio. These models provide powerful data priors that can solve …

Provably robust score-based diffusion posterior sampling for plug-and-play image reconstruction

X Xu, Y Chi - Advances in Neural Information Processing …, 2025 - proceedings.neurips.cc
In a great number of tasks in science and engineering, the goal is to infer an unknown image
from a small number of noisy measurements collected from a known forward model …

Learning diffusion priors from observations by expectation maximization

F Rozet, G Andry, F Lanusse… - Advances in Neural …, 2025 - proceedings.neurips.cc
Diffusion models recently proved to be remarkable priors for Bayesian inverse problems.
However, training these models typically requires access to large amounts of clean data …

TFG: Unified training-free guidance for diffusion models

H Ye, H Lin, J Han, M Xu, S Liu… - Advances in …, 2025 - proceedings.neurips.cc
Given an unconditional diffusion model and a predictor for a target property of interest (e.g., a
classifier), the goal of training-free guidance is to generate samples with desirable target …

Iterated denoising energy matching for sampling from Boltzmann densities

T Akhound-Sadegh, J Rector-Brooks, AJ Bose… - arXiv preprint arXiv …, 2024 - arxiv.org
Efficiently generating statistically independent samples from an unnormalized probability
distribution, such as equilibrium samples of many-body systems, is a foundational problem …

Amortizing intractable inference in diffusion models for vision, language, and control

S Venkatraman, M Jain, L Scimeca, M Kim… - arXiv preprint arXiv …, 2024 - arxiv.org
Diffusion models have emerged as effective distribution estimators in vision, language, and
reinforcement learning, but their use as priors in downstream tasks poses an intractable …

EraseDiff: Erasing data influence in diffusion models

J Wu, T Le, M Hayat, M Harandi - arXiv preprint arXiv:2401.05779, 2024 - arxiv.org
We introduce EraseDiff, an unlearning algorithm designed for diffusion models to address
concerns related to data memorization. Our approach formulates the unlearning task as a …

Guidance with spherical gaussian constraint for conditional diffusion

L Yang, S Ding, Y Cai, J Yu, J Wang, Y Shi - arXiv preprint arXiv …, 2024 - arxiv.org
Recent advances in diffusion models attempt to handle conditional generative tasks by
utilizing a differentiable loss function for guidance without the need for additional training …