Implement .swap() against diffusers 0.12 #2385
Conversation
MPS .swap is non-functional until kulinseth/pytorch#222 is merged
Ok, I think this is good. Can I get some testing support on Windows and Linux please?
…ention_control_reimplementation
mother and daughter.swap(son) having lunch.swap(dinner)
Oh, do we get to use more than one operation now? The previous implementation was limited to one, I thought.
I never tested two before, so I didn't know that was a limitation. All ok then.
Testing on Linux, results seem poor. [image comparisons omitted] Using SD 1.5, DDIM, 25 steps.
@keturn can you try with […]?
@damian0815 Perhaps t_start should default to something like 0.2 so there's a visual difference?
Interesting. Setting […] I guess this will probably all be more comprehensible once we get the attention map visualizations back, huh?
Same goes for the test prompt I got from hipsterusername a while back: with default settings […] (image comparison omitted).
This warning still pops up in the log: "warning: cross-attention control options are not working properly for >1 edit" but using multiple swaps definitely does do stuff. Is it warning us that you can have multiple edits but not have independent values of t_start/t_end for them?
yep, t_start should default to something >= 1 step. probably 1 step would be fine. i wonder if […]
yes, that's the warning. i do want to eventually address that - should be more clear how now that i've broken off […]
ok no, skipping the first step is a bad idea, i'll just make the default 0.1
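For context, here is a minimal sketch of how a fractional t_start/t_end window can gate when the swapped attention is applied during denoising. The function and names below are a hypothetical illustration, not the PR's actual code; it only assumes t_start/t_end are fractions of the step schedule, with the 0.1 default discussed above.

```python
# Hypothetical sketch: decide per denoising step whether the .swap() edit
# is active, given t_start/t_end expressed as fractions of the schedule.
def swap_is_active(step_index: int, total_steps: int,
                   t_start: float = 0.1, t_end: float = 1.0) -> bool:
    """True if the edited (swapped) attention should be used at this step.

    With t_start=0.1 and 25 DDIM steps, the first couple of steps run with
    the original prompt's attention, so the overall composition is laid
    down before the swap takes effect.
    """
    fraction = step_index / max(total_steps - 1, 1)
    return t_start <= fraction <= t_end


# Example: with the default 0.1, count how many of 25 steps apply the edit.
active_steps = sum(swap_is_active(i, 25) for i in range(25))
```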
…st step" This reverts commit 27ee939.
i took the liberty of ticking the […]
@keturn i also took the liberty of "resolving" the concerns you raised re: the naming of the […]
I've now experimented with some high-RAM operations before and after running the swap to confirm that it does indeed put the memory-efficient attention settings back correctly, and that's working well for me both with xformers and without. ✔️
There are still minor details I'm unclear on (like why you can pass None to restore_default_cross_attention), but overall this is a huge improvement to the stability of the cross-attention code with diffusers 0.12 and I think it's good to merge. 👍
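For anyone wanting to repeat that check without watching RAM, here is a rough sketch of verifying that the processors get put back, assuming diffusers' attn_processors property on the UNet; pipeline and the generation call are placeholders, not part of this PR.

```python
# Rough verification sketch (not part of this PR): snapshot the attention
# processor classes before a .swap() generation and compare afterwards.
def processor_class_names(unet) -> dict:
    # diffusers maps attention layer names to processor instances
    return {name: type(proc).__name__ for name, proc in unet.attn_processors.items()}

before = processor_class_names(pipeline.unet)
# ... run a generation that uses a .swap() prompt here ...
after = processor_class_names(pipeline.unet)

assert before == after, "cross-attention processors were not restored"
```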
Re-implementation of .swap() for diffusers 0.12's new CrossAttnProcessor API.

needs diffusers 0.12: pip install https://github.com/huggingface/diffusers

currently only tested/working on mac CPU (invoke.py --always_use_cpu).

todo:
- restore the default CrossAttnProcessor after it quits - it should automatically go back to eg xformers if that's what was there before doing a .swap()
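For that todo item, here is a minimal sketch of the save-and-restore pattern it describes, assuming diffusers' attn_processors / set_attn_processor API on the UNet; the custom processor and generation callback are placeholders, and this is not the PR's actual implementation.

```python
def run_with_swapped_attention(unet, custom_processor, do_generation):
    """Temporarily install a custom cross-attention processor on every
    attention layer, then put back whatever was there before (e.g. the
    xformers processors), even if generation raises."""
    saved_processors = unet.attn_processors  # layer name -> processor instance
    try:
        unet.set_attn_processor(custom_processor)
        return do_generation()
    finally:
        # restore the original processors, whatever they were
        unet.set_attn_processor(saved_processors)
```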