Separate Sigma Schedule #10146


Closed
wants to merge 21 commits into from
Conversation

@hlky (Contributor) commented Dec 7, 2024

What does this PR do?

This is not a finalised design, just a demonstration of how the sigma/noise schedule can be moved out of the schedulers; comments and feedback are encouraged.

Usage:

from diffusers import HeunDiscreteScheduler
from diffusers.schedulers.sigmas import BetaSigmas, ExponentialSigmas, KarrasSigmas

sigma_schedule = ExponentialSigmas()
scheduler = HeunDiscreteScheduler.from_pretrained(
    "stabilityai/stable-diffusion-xl-base-1.0",
    subfolder="scheduler",
    sigma_schedule=sigma_schedule,
)
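For illustration, here is a minimal sketch of the kind of computation a sigma schedule class like ExponentialSigmas performs: log-linearly spaced sigmas between the existing schedule's max and min (this mirrors diffusers' `_convert_to_exponential`; the exact call signature here is an assumption, not the PR's final API):

```python
import math
import numpy as np

class ExponentialSigmasSketch:
    """Hypothetical stand-in for ExponentialSigmas: log-linearly spaced sigmas."""

    def __call__(self, in_sigmas: np.ndarray, num_inference_steps: int) -> np.ndarray:
        sigma_min, sigma_max = in_sigmas[-1].item(), in_sigmas[0].item()
        # Geometric spacing from sigma_max down to sigma_min
        return np.exp(np.linspace(math.log(sigma_max), math.log(sigma_min), num_inference_steps))

sigmas = ExponentialSigmasSketch()(np.array([14.6, 0.03]), num_inference_steps=10)
```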

Who can review?

Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.

cc @yiyixuxu

@HuggingFaceDocBuilderDev

The docs for this PR live here. All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update.

@hlky hlky force-pushed the separate-sigma-schedule branch from e4daabb to 8703cdc Compare December 18, 2024 13:32
from ..sigmas.exponential_sigmas import ExponentialSigmas
from ..sigmas.karras_sigmas import KarrasSigmas

class FlowMatchSD3:
@yiyixuxu (Collaborator) commented Dec 18, 2024

Is there any reason the flow match sigmas are treated differently from beta/exponential/karras? They all generate a sigma schedule, no?

@hlky (Contributor, Author) replied:

Beta/exponential/karras are more like a conversion of an existing sigma schedule; the original functions are actually called _convert_to_*
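For reference, the Karras-style conversion these `_convert_to_*` functions implement can be sketched like this: rescale an existing sigma range onto the Karras et al. (2022) spacing (rho=7.0 as in diffusers' `_convert_to_karras`; the function name here is illustrative):

```python
import numpy as np

def convert_to_karras_sketch(in_sigmas: np.ndarray, num_inference_steps: int, rho: float = 7.0) -> np.ndarray:
    # Takes the min/max of an existing sigma schedule and redistributes
    # the steps along the Karras et al. (2022) curve.
    sigma_min, sigma_max = in_sigmas[-1].item(), in_sigmas[0].item()
    ramp = np.linspace(0, 1, num_inference_steps)
    min_inv_rho = sigma_min ** (1 / rho)
    max_inv_rho = sigma_max ** (1 / rho)
    return (max_inv_rho + ramp * (min_inv_rho - max_inv_rho)) ** rho

karras = convert_to_karras_sketch(np.array([14.6, 0.03]), 10)
```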

@hlky (Contributor, Author) replied:

FlowMatchEuler currently has what we're calling FlowMatchSD3 here by default, and for other models we pass in sigmas; I think the one for SANA is different again, so these schedules tend to be model specific. By contrast, everything in BetaSchedule, like timestep_spacing == "linspace" and interpolation_type == "linear", is shared, and we rarely pass in sigmas (the exception is the Align Your Steps schedules, which I don't think are used often). We could inline these like timestep_spacing == "linspace" in BetaSchedule, but this way the idea is that, in addition to the built-in ones, we can pass a custom class that just needs a call returning the sigmas, which makes new models, custom schedules, and experimentation easier to support. We could refactor BetaSchedule in a similar way (LinspaceLinear etc.) and allow the same level of experimentation; it just seems less likely to be used, and experimentation with that kind of scheduler tends to be done through conversion. There are a few more, less popular, schedules from the community that we don't support, but again the idea is that those can be passed in as a custom class.

@hlky (Contributor, Author) commented Dec 19, 2024

I've added some notes and SANA's schedule, and combined scale_noise (from FlowMatchEuler) with add_noise; they're the same except for one extra multiplication in the Flow Match case, and the signature order is different. I'm trying to cover most things that will need changing so it can all be reviewed at the same time. I'll look at some other schedulers like DDIM in case there's anything specific in those.
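To illustrate the "one extra multiplication", here is a sketch (not the PR's code) of the two noising formulas: the k-diffusion-style add_noise used by the sigma-space schedulers versus the flow-match form used by scale_noise:

```python
import numpy as np

def add_noise_sigma(sample, noise, sigma):
    # k-diffusion style: x_t = x_0 + sigma * eps
    return sample + sigma * noise

def scale_noise_flow_match(sample, noise, sigma):
    # flow matching: x_t = (1 - sigma) * x_0 + sigma * eps
    # identical to the above except for the extra (1 - sigma) factor on the sample
    return (1.0 - sigma) * sample + sigma * noise

x = np.ones(4)
eps = np.full(4, 2.0)
```

At sigma = 1 the flow-match form reduces to pure noise, which is why its sigmas run from 1.0 down toward 0.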

@@ -245,6 +245,34 @@ def from_config(cls, config: Union[FrozenDict, Dict[str, Any]] = None, return_un
deprecate("config-passed-as-path", "1.0.0", deprecation_message, standard_warn=False)
config, kwargs = cls.load_config(pretrained_model_name_or_path=config, return_unused_kwargs=True, **kwargs)

# Handle old scheduler configs
@hlky (Contributor, Author) replied:

This will need to be robust and probably kept for a while, unless we can find a way to mass-update configs on the Hub. It's working with some scheduler configs already: FlowMatch vs Beta is detected via shift and beta_schedule. I've already found an edge case in SANA's config, because we integrated those scheduler changes into DPM, so it has beta_schedule and no shift (it was called flow_shift instead).
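A sketch of the kind of detection described above; the keys and heuristics here are assumptions reconstructed from the comment, not the PR's actual code:

```python
def detect_schedule_type(config: dict) -> str:
    # Heuristic: flow-match configs carry `shift`, beta configs carry `beta_schedule`.
    # SANA-style DPM configs are the edge case: `beta_schedule` is present and the
    # shift is stored under `flow_shift` instead of `shift`.
    if "shift" in config and "beta_schedule" not in config:
        return "flow_match"
    if "flow_shift" in config:
        return "flow_match"
    if "beta_schedule" in config:
        return "beta"
    return "unknown"
```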

@hlky (Contributor, Author) commented Dec 20, 2024

[Image: sampler/noise-schedule compatibility table for Flux from community UIs]

It's important to note here that we don't expect all combinations of sampler + noise schedule to work for all models, especially Flow Match models; above is a table of compatibility with Flux from community UIs (the sampling code is almost 1:1 between Forge, Comfy, etc.).

@hlky (Contributor, Author) commented Jan 8, 2025

Some code to demonstrate how schedulers work at this stage in the refactor.

from diffusers import EulerDiscreteScheduler, EulerAncestralDiscreteScheduler, HeunDiscreteScheduler
from diffusers.schedulers.schedules.beta_schedule import BetaSchedule
from diffusers.schedulers.schedules.flow_schedule import FlowMatchSchedule, FlowMatchSD3, FlowMatchFlux, FlowMatchSANA, FlowMatchHunyuanVideo, FlowMatchLinearQuadratic
from diffusers.schedulers.sigmas.beta_sigmas import BetaSigmas
from diffusers.schedulers.sigmas.exponential_sigmas import ExponentialSigmas
from diffusers.schedulers.sigmas.karras_sigmas import KarrasSigmas
import numpy as np

# mapped from FlowMatchEulerDiscreteScheduler
# Euler sampling with Flow Match schedule
euler = EulerDiscreteScheduler.from_pretrained("black-forest-labs/FLUX.1-dev", subfolder="scheduler")
# Heun sampling with Flow Match schedule
heun = HeunDiscreteScheduler.from_config(euler.config)
# Euler Ancestral actually gives noisy output with Flow Match
ancestral = EulerAncestralDiscreteScheduler.from_config(euler.config)

# currently, if this is used in Flux pipeline `base_schedule` is set to `FlowMatchFlux`
# SD3 base schedule
euler._schedule.base_schedule = FlowMatchSD3()
# SANA base schedule
euler._schedule.base_schedule = FlowMatchSANA()

class FlowMatchCustom:
    def __call__(self, num_inference_steps: int, **kwargs) -> np.ndarray:
        sigmas = np.linspace(1.0, 1 / num_inference_steps, num_inference_steps)
        half = num_inference_steps // 2
        sigmas[half:] = sigmas[half:] * 1.2
        return sigmas
euler._schedule.base_schedule = FlowMatchCustom()

# configs contain `base_schedule` as `str`
flow_schedule = FlowMatchSchedule(
    shift=13.0,
    use_dynamic_shifting=False,
    base_schedule="FlowMatchSD3"
)

# we can also use the class directly
flow_schedule = FlowMatchSchedule(
    shift=13.0,
    use_dynamic_shifting=False,
    base_schedule=FlowMatchCustom()
)

# Euler sampling with Beta schedule
euler = EulerDiscreteScheduler.from_pretrained("stabilityai/stable-diffusion-xl-base-1.0", subfolder="scheduler")

# Euler sampling with Beta schedule and Karras sigmas
euler = EulerDiscreteScheduler(
    schedule_config=BetaSchedule(
        beta_end=0.012,
        beta_schedule="scaled_linear",
        beta_start=0.00085,
        timestep_spacing="leading",
    ),
    sigma_schedule_config=KarrasSigmas(),
)

# Euler sampling with Beta schedule and Beta sigmas
euler._sigma_schedule = BetaSigmas()
# Euler sampling with Beta schedule and Exponential sigmas
euler._sigma_schedule = ExponentialSigmas()

Tests with black-forest-labs/FLUX.1-dev

Euler

| Base | Beta | Exponential | Karras |
| --- | --- | --- | --- |
| EulerDiscreteScheduler | EulerDiscreteScheduler_BetaSigmas | EulerDiscreteScheduler_ExponentialSigmas | EulerDiscreteScheduler_KarrasSigmas |

Euler Ancestral

| Base | Beta | Exponential | Karras |
| --- | --- | --- | --- |
| EulerAncestralDiscreteScheduler | EulerAncestralDiscreteScheduler_BetaSigmas | EulerAncestralDiscreteScheduler_ExponentialSigmas | EulerAncestralDiscreteScheduler_KarrasSigmas |

Heun

| Beta | Exponential |
| --- | --- |
| HeunDiscreteScheduler_BetaSigmas | HeunDiscreteScheduler_ExponentialSigmas |

Note: Base Heun and Heun with Karras were also tested; I just lost the output files from that run.

@hlky (Contributor, Author) commented Jan 12, 2025

Hi @ukaprch. Thanks for your interest in Flow Match scheduling support. We know this is highly anticipated and we appreciate your patience while we work on it 🤗

@yiyixuxu (Collaborator) commented Jan 14, 2025

I think the idea is to separate the sigma class out from the scheduler class, so it becomes a separate part of the pipeline (i.e. you can change the scheduler without changing the sigmas, e.g. we can change it at run time). Is that possible?

The current schedulers already accept sigmas, but they still depend on the scheduler config to convert to k-sigmas etc.

The API here is still part of the scheduler; it just makes it more configurable, I think:

sigma_schedule = ExponentialSigmas()
scheduler = HeunDiscreteScheduler.from_pretrained(
    "stabilityai/stable-diffusion-xl-base-1.0",
    subfolder="scheduler",
    sigma_schedule=sigma_schedule,
)

The ideal design should not require much change to the current scheduler class, IMO.
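For context, sigmas can already be computed entirely outside the scheduler today; a sketch (the hand-off to the scheduler is shown only as a comment, with the `sigmas=` keyword of EulerDiscreteScheduler.set_timesteps assumed):

```python
import numpy as np

# Compute a custom sigma schedule entirely outside the scheduler
# (a plain linear schedule here, purely for illustration).
num_inference_steps = 10
sigma_max, sigma_min = 14.6, 0.03
custom_sigmas = np.linspace(sigma_max, sigma_min, num_inference_steps)

# With a real scheduler this would then be handed over at run time, e.g.:
# scheduler.set_timesteps(sigmas=custom_sigmas.tolist(), device="cpu")
```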

@hlky (Contributor, Author) commented Jan 14, 2025

That is what I was looking at initially; while you've been working on Modular I've added a further experimental design.

We can make the sigmas, as in Beta, Exponential, Karras, a separate part of the pipeline, but they still have to be passed into the scheduler, because these conversions run before the final sigma (zero, or sigma_min) is concatenated.
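The ordering constraint can be sketched like this: the Beta/Exponential/Karras conversion must see the raw schedule, and only afterwards is the final sigma (zero here) appended (function and parameter names are illustrative):

```python
import numpy as np

def build_sigmas(base_sigmas: np.ndarray, convert=None) -> np.ndarray:
    # 1) optional conversion (karras/exponential/beta) on the raw schedule
    if convert is not None:
        base_sigmas = convert(base_sigmas, len(base_sigmas))
    # 2) only then append the final sigma the sampler steps to
    return np.concatenate([base_sigmas, [0.0]])

out = build_sigmas(np.linspace(14.6, 0.03, 10))
```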


These can be used if we add a ConfigMixin etc.

Afaik some scheduler configs do specify use_karras_sigmas=True, so we will need to find a way to set e.g. KarrasSigmas on the pipeline.

We could also do the same for Flow Match base schedules: add a ConfigMixin etc., so the class computes sigmas that are passed into the scheduler, instead of the current approach of inlining that code in the pipelines.

import numpy as np

class FlowMatchFlux:
    def __call__(self, num_inference_steps: int, **kwargs) -> np.ndarray:
        return np.linspace(1.0, 1 / num_inference_steps, num_inference_steps)
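As a self-contained check of that base schedule (the class is repeated here so the snippet runs standalone), four steps yield evenly spaced sigmas from 1.0 down to 1/num_inference_steps:

```python
import numpy as np

class FlowMatchFlux:
    def __call__(self, num_inference_steps: int, **kwargs) -> np.ndarray:
        # Evenly spaced flow-match sigmas from 1.0 down to 1/N
        return np.linspace(1.0, 1 / num_inference_steps, num_inference_steps)

sigmas = FlowMatchFlux()(num_inference_steps=4)
# -> [1.0, 0.75, 0.5, 0.25]
```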

@github-actions bot commented Feb 8, 2025

This issue has been automatically marked as stale because it has not had recent activity. If you think this still needs to be addressed please comment on this thread.

Please note that issues that do not follow the contributing guidelines are likely to be ignored.

@github-actions github-actions bot added the stale Issues that haven't received updates label Feb 8, 2025
@hlky hlky closed this Apr 15, 2025
@hlky hlky deleted the separate-sigma-schedule branch April 15, 2025 12:29
Labels
scheduler stale Issues that haven't received updates
3 participants