Consistency Models

Yang Song, Prafulla Dhariwal, Mark Chen, Ilya Sutskever

In International Conference on Machine Learning, 2023

sampling

@inproceedings{song2023consistency,
title={Consistency models},
author={Song, Yang and Dhariwal, Prafulla and Chen, Mark and Sutskever, Ilya},
booktitle={International Conference on Machine Learning, {ICML}},
series={Proceedings of Machine Learning Research},
volume={202},
pages={32211--32252},
year={2023},
publisher={{PMLR}}
}

TL;DR: This paper introduces Consistency Models, a new family of generative models built on diffusion models that enables one-step generation.

Table of Contents

1. Introduction
2. Background
3. Consistency Models

1. Introduction

Consistency models are a new family of models, built on continuous-time diffusion models [Song et al. ICLR 2021], [Karras et al. NeurIPS 2022], that achieve one-step generation.

2. Background

Continuous-time diffusion models [Song et al. ICLR 2021] are formulated with the stochastic differential equation (SDE):

(Eq. 1) $\mathrm{d}\mathbf{x}_t = \underbrace{\boldsymbol{\mu}(\mathbf{x}_t, t)\,\mathrm{d}t}_{\text{deterministic term}} + \underbrace{\sigma(t)\,\mathrm{d}\mathbf{w}_t}_{\text{stochastic term}}$

where:

  • $\sigma(\cdot)$ is the noise schedule.
  • $\boldsymbol{\mu}(\cdot, \cdot)$ is the drift function.
  • $\{\mathbf{w}_t\}_{t \in [0,T]}$ is a Wiener process or Brownian motion.
Let $p_t(\mathbf{x})$ denote the distribution of $\mathbf{x}_t$; note that $p_0(\mathbf{x}) \equiv p_{\text{data}}(\mathbf{x})$. A small simulation sketch of this SDE follows.
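To make Eq. 1 concrete, here is a minimal sketch (not the paper's code) of simulating the forward SDE with the Euler–Maruyama scheme. The function name `euler_maruyama` and the concrete choices $\boldsymbol{\mu} = 0$, $\sigma(t) = \sqrt{2t}$ (the schedule used by [Karras et al. NeurIPS 2022]) are assumptions for illustration:

```python
import numpy as np

def euler_maruyama(x0, mu, sigma, T=1.0, n_steps=1000, seed=0):
    """Simulate Eq. 1, dx_t = mu(x_t, t) dt + sigma(t) dw_t, with Euler-Maruyama.
    (Illustrative sketch; mu and sigma are user-supplied callables.)"""
    rng = np.random.default_rng(seed)
    dt = T / n_steps
    x = np.array(x0, dtype=float)
    for i in range(n_steps):
        t = i * dt
        # Wiener increment: dw_t ~ N(0, dt * I)
        dw = rng.normal(scale=np.sqrt(dt), size=x.shape)
        x = x + mu(x, t) * dt + sigma(t) * dw
    return x

# Example: zero drift and sigma(t) = sqrt(2t); the added noise variance is
# integral of sigma(t)^2 = T^2, so x_T ~ N(x_0, T^2 I).
x_T = euler_maruyama(x0=np.zeros(2),
                     mu=lambda x, t: np.zeros_like(x),
                     sigma=lambda t: np.sqrt(2.0 * t))
```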
From the SDE, and with the help of the Fokker–Planck equation, we can derive the Probability Flow ODE (PF-ODE) (for the derivation of [Eq. 2] check [Song et al. ICLR 2021]):

(Eq. 2) $\mathrm{d}\mathbf{x}_t = \left[\boldsymbol{\mu}(\mathbf{x}_t, t) - \frac{1}{2}\sigma(t)^2\,\nabla \log p_t(\mathbf{x}_t)\right]\mathrm{d}t$ [Karras et al. NeurIPS 2022]
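Eq. 2 can be integrated backwards in time to map noise to data. Below is a minimal sketch of doing so with plain Euler steps; everything is illustrative: `pf_ode_euler` is a hypothetical name, `score(x, t)` stands in for a learned score model $s_\phi(\mathbf{x}, t) \approx \nabla \log p_t(\mathbf{x})$, and the toy example uses the analytic score of a Gaussian so the snippet runs without a trained network:

```python
import numpy as np

def pf_ode_euler(x_T, score, mu, sigma, T=1.0, n_steps=100):
    """Integrate the PF-ODE of Eq. 2 from t = T down to t = 0 with Euler steps.
    (Illustrative sketch; score(x, t) approximates grad log p_t(x).)"""
    dt = T / n_steps
    x = np.array(x_T, dtype=float)
    for i in range(n_steps, 0, -1):
        t = i * dt
        # Deterministic drift of Eq. 2: mu(x, t) - (1/2) sigma(t)^2 * score(x, t)
        drift = mu(x, t) - 0.5 * sigma(t) ** 2 * score(x, t)
        x = x - drift * dt  # Euler step backwards in time
    return x

# Toy example: mu = 0, sigma(t) = sqrt(2t), data ~ N(0, I), so
# p_t = N(0, (1 + t^2) I) and the exact score is -x / (1 + t^2).
rng = np.random.default_rng(0)
x_0 = pf_ode_euler(x_T=rng.normal(size=2) * np.sqrt(2.0),
                   score=lambda x, t: -x / (1.0 + t ** 2),
                   mu=lambda x, t: np.zeros_like(x),
                   sigma=lambda t: np.sqrt(2.0 * t))
```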

3. Consistency Models

(Eq. 3) $\mathcal{L}_{CD}^N(\boldsymbol{\theta}, \boldsymbol{\theta}^-; \phi) = \mathbb{E}\left[\lambda(t_n)\, d\big(f_{\boldsymbol{\theta}}(\mathbf{x}_{t_{n+1}}, t_{n+1}),\ f_{\boldsymbol{\theta}^-}(\hat{\mathbf{x}}^{\phi}_{t_n}, t_n)\big)\right]$

where $f_{\boldsymbol{\theta}}$ is the consistency model, $\boldsymbol{\theta}^-$ is a running (exponential moving) average of $\boldsymbol{\theta}$, $d(\cdot, \cdot)$ is a distance metric, $\lambda(\cdot)$ is a positive weighting function, and $\hat{\mathbf{x}}^{\phi}_{t_n}$ is the estimate of $\mathbf{x}_{t_n}$ obtained by running one step of a numerical ODE solver for the PF-ODE (Eq. 2) with the pre-trained diffusion model $\phi$, starting from $\mathbf{x}_{t_{n+1}}$. Remember that the expectation $\mathbb{E}$ is the theoretical mean over the distribution, not an empirical mean over samples.
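A minimal PyTorch sketch of one Monte-Carlo estimate of this loss follows. It is illustrative only: `f_theta`, `f_theta_minus` (the EMA target), and `ode_solver_step` (one PF-ODE solver step of the pre-trained teacher $\phi$ from $t_{n+1}$ to $t_n$) are hypothetical callables; the perturbation $\mathbf{x}_{t_{n+1}} = \mathbf{x}_0 + t_{n+1}\boldsymbol{\epsilon}$ assumes the parameterization of [Karras et al. NeurIPS 2022]; and $d(\cdot, \cdot)$ is taken here as squared $\ell_2$, whereas the paper also uses LPIPS:

```python
import torch

def cd_loss(f_theta, f_theta_minus, ode_solver_step, x0, ts, lam=lambda t: 1.0):
    """One Monte-Carlo estimate of the consistency distillation loss (Eq. 3).

    f_theta:         online consistency model f_theta(x, t)      (hypothetical)
    f_theta_minus:   EMA target network f_{theta^-}(x, t)        (hypothetical)
    ode_solver_step: one PF-ODE solver step of the pre-trained
                     diffusion model phi, from t_{n+1} to t_n    (hypothetical)
    ts:              1-D tensor of discretization times t_1 < ... < t_N
    """
    # Sample a random index n per batch element
    n = torch.randint(0, ts.numel() - 1, (x0.shape[0],))
    t_n, t_np1 = ts[n], ts[n + 1]
    # Perturb data to time t_{n+1}: x_{t_{n+1}} = x_0 + t_{n+1} * eps
    eps = torch.randn_like(x0)
    x_tnp1 = x0 + t_np1.view(-1, *([1] * (x0.dim() - 1))) * eps
    with torch.no_grad():
        # x_hat^phi_{t_n}: one solver step along the PF-ODE with the teacher
        x_hat_tn = ode_solver_step(x_tnp1, t_np1, t_n)
        target = f_theta_minus(x_hat_tn, t_n)  # no gradient through theta^-
    # d(., .) as squared L2 distance, weighted by lambda(t_n)
    diff = (f_theta(x_tnp1, t_np1) - target).flatten(1)
    return (lam(t_n) * diff.pow(2).sum(dim=1)).mean()
```

Gradients flow only through $f_{\boldsymbol{\theta}}$; both the teacher's solver step and the target network are evaluated under `torch.no_grad()`, matching the role of $\boldsymbol{\theta}^-$ as a stop-gradient EMA target.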
