InstaFlow: One step is enough for high-quality diffusion-based text-to-image generation

X Liu, X Zhang, J Ma, J Peng - The Twelfth International …, 2023 - openreview.net
Diffusion models have revolutionized text-to-image generation with their exceptional quality
and creativity. However, their multi-step sampling process is known to be slow, often requiring …

Fine-tuning discrete diffusion models via reward optimization with applications to DNA and protein design

C Wang, M Uehara, Y He, A Wang, T Biancalani… - arXiv preprint arXiv …, 2024 - arxiv.org
Recent studies have demonstrated the strong empirical performance of diffusion models on
discrete sequences across domains from natural language to biological sequence …

Improving in-context learning in diffusion models with visual context-modulated prompts

T Chen, Y Liu, Z Wang, J Yuan, Q You, H Yang… - arXiv preprint arXiv …, 2023 - arxiv.org
In light of the remarkable success of in-context learning in large language models, its
potential extension to the vision domain, particularly with visual foundation models like …

Long and Short Guidance in Score identity Distillation for One-Step Text-to-Image Generation

M Zhou, Z Wang, H Zheng, H Huang - arXiv preprint arXiv:2406.01561, 2024 - arxiv.org
Diffusion-based text-to-image generation models trained on extensive text-image pairs have
shown the capacity to generate photorealistic images consistent with textual descriptions …

ParamReL: Learning parameter space representation via progressively encoding Bayesian flow networks

Z Wu, X Fan, J Li, Z Zhao, H Chen, L Cao - arXiv preprint arXiv …, 2024 - arxiv.org
The recently proposed Bayesian Flow Networks (BFNs) show great potential in modeling
parameter spaces, offering a unified strategy for handling continuous, discretized, and …

Score Forgetting Distillation: A Swift, Data-Free Method for Machine Unlearning in Diffusion Models

T Chen, S Zhang, M Zhou - arXiv preprint arXiv:2409.11219, 2024 - arxiv.org
The machine learning community is increasingly recognizing the importance of fostering
trust and safety in modern generative AI (GenAI) models. We posit machine unlearning (MU) …

Advancing Graph Generation through Beta Diffusion

X Liu, Y He, B Chen, M Zhou - arXiv preprint arXiv:2406.09357, 2024 - arxiv.org
Diffusion models have excelled in generating natural images and are now being adapted to
a variety of data types, including graphs. However, conventional models often rely on …

Marked Temporal Bayesian Flow Point Processes

H Chen, X Fan, H Liu, L Cao - arXiv preprint arXiv:2410.19512, 2024 - arxiv.org
Marked event data capture events by recording their continuous-valued occurrence
timestamps along with their corresponding discrete-valued types. They have appeared in …

Logistic-beta processes for modeling dependent random probabilities with beta marginals

CJ Lee, A Zito, H Sang, DB Dunson - arXiv preprint arXiv:2402.07048, 2024 - arxiv.org
The beta distribution serves as a canonical tool for modeling probabilities and is extensively
used in statistics and machine learning, especially in the field of Bayesian nonparametrics …