Diffusers contains multiple pre-built schedule functions for the diffusion process.
The schedule functions, denoted *Schedulers* in the library, take in the output of a trained model, the sample the diffusion process is iterating on, and a timestep, and return a denoised sample.
- Schedulers define the methodology for iteratively adding noise to an image or for updating a sample based on model outputs.
- During training, adding noise in different manners represents the algorithmic process used to train a diffusion model.
- During inference, the scheduler defines how to update a sample based on the output of a pretrained model.
- Schedulers are often defined by a noise schedule and an update rule for solving the differential equation.
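As an illustration of a noise schedule, the forward (noising) process of a DDPM-style schedule can be sketched in a few lines of NumPy. The linear `beta` range and all names below are illustrative choices for this sketch, not Diffusers' exact defaults:

```python
import numpy as np

# Toy linear noise schedule (illustrative values, not the library's defaults).
num_train_timesteps = 1000
betas = np.linspace(1e-4, 0.02, num_train_timesteps)
alphas_cumprod = np.cumprod(1.0 - betas)

def add_noise(sample, noise, timestep):
    """Noise a clean sample to timestep t: x_t = sqrt(a_bar_t) * x_0 + sqrt(1 - a_bar_t) * eps."""
    a_bar = alphas_cumprod[timestep]
    return np.sqrt(a_bar) * sample + np.sqrt(1.0 - a_bar) * noise

x0 = np.ones(4)           # clean sample
eps = np.zeros(4)         # stand-in for Gaussian noise
x_t = add_noise(x0, eps, 999)  # at a late timestep, little of the signal remains
```

The later the timestep, the smaller the cumulative product `alphas_cumprod[t]`, so the signal term shrinks and the noise term dominates.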
All schedulers take in a timestep to predict the updated version of the sample being diffused.
The timestep dictates where in the diffusion process the step occurs: data is generated by iterating forward in time, and inference is executed by propagating backwards through the timesteps.
Different algorithms use timesteps that can be either discrete (accepting `int` inputs), such as the [`DDPMScheduler`] or [`PNDMScheduler`], or continuous (accepting `float` inputs), such as the score-based schedulers [`ScoreSdeVeScheduler`] or [`ScoreSdeVpScheduler`].
The core design principle behind the schedule functions is to be model, system, and framework independent. This allows for rapid experimentation and cleaner abstractions in the code, where the model prediction is separated from the sample update. To this end, the design of schedulers is such that:
- Schedulers can be used interchangeably between diffusion models in inference to find the preferred trade-off between speed and generation quality.
- Schedulers are implemented in PyTorch by default, but are designed to be framework independent (partial JAX support currently exists).
The core API for any new scheduler must follow a limited structure.
- Schedulers should provide one or more `step(...)` functions that are called to iteratively update the generated sample.
- Schedulers should provide a `set_timesteps(...)` method that configures the parameters of a schedule function for a specific inference task.
- Schedulers should be framework-specific.
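To make that structure concrete, here is a minimal, framework-free sketch of a scheduler exposing the two core methods, `set_timesteps(...)` and `step(...)`. The update rule below (a DDIM-style deterministic step assuming an epsilon-predicting model) and every name in it are illustrative assumptions, not Diffusers' actual implementation:

```python
import numpy as np

class ToyScheduler:
    """Illustrative scheduler with the two core API methods: set_timesteps and step."""

    def __init__(self, num_train_timesteps=1000):
        self.num_train_timesteps = num_train_timesteps
        betas = np.linspace(1e-4, 0.02, num_train_timesteps)
        self.alphas_cumprod = np.cumprod(1.0 - betas)
        self.timesteps = None

    def set_timesteps(self, num_inference_steps):
        # Pick a sparse, descending subset of the training timesteps for inference.
        stride = self.num_train_timesteps // num_inference_steps
        self.timesteps = np.arange(0, self.num_train_timesteps, stride)[::-1]

    def step(self, model_output, timestep, sample):
        # DDIM-like deterministic update, assuming the model predicts the noise (epsilon).
        a_bar = self.alphas_cumprod[timestep]
        pred_x0 = (sample - np.sqrt(1.0 - a_bar) * model_output) / np.sqrt(a_bar)
        prev_t = timestep - self.num_train_timesteps // len(self.timesteps)
        a_bar_prev = self.alphas_cumprod[prev_t] if prev_t >= 0 else 1.0
        return np.sqrt(a_bar_prev) * pred_x0 + np.sqrt(1.0 - a_bar_prev) * model_output

# Usage sketch: configure the schedule, then iterate step(...) over the timesteps.
scheduler = ToyScheduler()
scheduler.set_timesteps(num_inference_steps=50)
sample = np.random.randn(4)
for t in scheduler.timesteps:
    model_output = np.zeros_like(sample)  # stand-in for a trained model's prediction
    sample = scheduler.step(model_output, t, sample)
```

The denoising loop itself never needs to know which update rule is in use, which is what makes schedulers swappable at inference time.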
The base class [`SchedulerMixin`] implements low-level utilities shared by multiple schedulers.
[[autodoc]] SchedulerMixin
The class [`SchedulerOutput`] contains the outputs from any scheduler's `step(...)` call.
[[autodoc]] schedulers.scheduling_utils.SchedulerOutput
Original paper can be found here.
[[autodoc]] DDIMScheduler
Original paper can be found here.
[[autodoc]] DDPMScheduler
Original paper can be found here and the improved version. The original implementation can be found here.
[[autodoc]] DPMSolverMultistepScheduler
Original paper can be found here.
[[autodoc]] KarrasVeScheduler
Original implementation can be found here.
[[autodoc]] LMSDiscreteScheduler
Original implementation can be found here.
[[autodoc]] PNDMScheduler
Original paper can be found here.
[[autodoc]] ScoreSdeVeScheduler
Original implementation can be found here.
[[autodoc]] IPNDMScheduler
Original paper can be found here.
Score SDE-VP is under construction.
[[autodoc]] schedulers.scheduling_sde_vp.ScoreSdeVpScheduler
Euler scheduler (Algorithm 2) from the paper Elucidating the Design Space of Diffusion-Based Generative Models by Karras et al. (2022). Based on the original k-diffusion implementation by Katherine Crowson. Fast scheduler which often generates good outputs with 20-30 steps.
[[autodoc]] EulerDiscreteScheduler
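For intuition, a single Euler step of the kind this scheduler family takes can be sketched as follows. The derivative `d = (x - denoised) / sigma` and the toy values are illustrative assumptions for this sketch, not the library's exact formulation:

```python
import numpy as np

def euler_step(sample, denoised, sigma, sigma_next):
    """One Euler step: follow the estimated derivative of the sample with
    respect to the noise level from sigma down to sigma_next."""
    d = (sample - denoised) / sigma
    return sample + (sigma_next - sigma) * d

# Toy check: with a perfect denoiser, one step from sigma to 0 recovers the clean sample.
x0 = np.full(3, 2.0)
noisy = x0 + 10.0 * np.ones(3)  # sample at sigma = 10
out = euler_step(noisy, x0, sigma=10.0, sigma_next=0.0)
```

In practice the scheduler takes many small steps through a decreasing sequence of sigmas rather than one large step.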
Ancestral sampling with Euler method steps. Based on the original [k-diffusion](https://github.com/crowsonkb/k-diffusion/blob/481677d114f6ea445aa009cf5bd7a9cdee909e47/k_diffusion/sampling.py#L72) implementation by Katherine Crowson. Fast scheduler which often generates good outputs with 20-30 steps.
[[autodoc]] EulerAncestralDiscreteScheduler
Original paper can be found here.
[[autodoc]] VQDiffusionScheduler
DDPM-based inpainting scheduler for unsupervised inpainting with extreme masks.
Intended for use with [`RePaintPipeline`].
Based on the paper RePaint: Inpainting using Denoising Diffusion Probabilistic Models and the original implementation by Andreas Lugmayr et al.: https://github.com/andreas128/RePaint
[[autodoc]] RePaintScheduler