[attention] Fix attention #2656

Merged: 3 commits merged into main from correct_attention_mask on Mar 13, 2023
Conversation

patrickvonplaten (Contributor) commented Mar 13, 2023

Make sure that we don't pass attention_mask (which is intended only for self-attention) to the cross-attention class. This was incorrectly implemented in terms of naming and, luckily, isn't used yet. This PR introduces a new encoder_attention_mask function argument that should be used instead when applying attention masks to Stable Diffusion's text embeddings.
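
For context, here is a minimal sketch of the masking convention this PR establishes. It is not the diffusers implementation; the class below and its use of torch.nn.MultiheadAttention are illustrative assumptions. The point is the routing: attention_mask masks the latent (self-attention) sequence, while encoder_attention_mask masks the text-encoder sequence consumed by cross-attention.

```python
from typing import Optional

import torch
import torch.nn as nn


class TransformerBlockSketch(nn.Module):
    """Illustrative block: self-attention over latents, cross-attention over text.

    Hypothetical class for demonstration only, not the diffusers API.
    """

    def __init__(self, dim: int, num_heads: int):
        super().__init__()
        self.norm1 = nn.LayerNorm(dim)
        self.self_attn = nn.MultiheadAttention(dim, num_heads, batch_first=True)
        self.norm2 = nn.LayerNorm(dim)
        self.cross_attn = nn.MultiheadAttention(dim, num_heads, batch_first=True)

    def forward(
        self,
        hidden_states: torch.Tensor,                            # (batch, num_latents, dim)
        attention_mask: Optional[torch.Tensor] = None,          # (batch, num_latents), True = masked
        encoder_hidden_states: Optional[torch.Tensor] = None,   # (batch, num_text_tokens, dim)
        encoder_attention_mask: Optional[torch.Tensor] = None,  # (batch, num_text_tokens), True = masked
    ) -> torch.Tensor:
        # Self-attention: keys/values are the latent tokens themselves,
        # so only attention_mask (latent-length) is valid here.
        normed = self.norm1(hidden_states)
        attn_out, _ = self.self_attn(
            normed, normed, normed, key_padding_mask=attention_mask
        )
        hidden_states = hidden_states + attn_out

        # Cross-attention: keys/values come from the text encoder, so the mask
        # must have text-sequence length. Passing attention_mask here (the
        # mix-up this PR guards against) would apply a latent-length mask to
        # text tokens.
        if encoder_hidden_states is not None:
            normed = self.norm2(hidden_states)
            attn_out, _ = self.cross_attn(
                normed,
                encoder_hidden_states,
                encoder_hidden_states,
                key_padding_mask=encoder_attention_mask,
            )
            hidden_states = hidden_states + attn_out
        return hidden_states
```

With the two arguments kept separate, a latent-length mask can never be silently applied to the text tokens, which is exactly the confusion the old single attention_mask argument invited.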

HuggingFaceDocBuilderDev commented Mar 13, 2023

The documentation is not available anymore as the PR was closed or merged.

patrickvonplaten merged commit 4ae54b3 into main on Mar 13, 2023
patrickvonplaten deleted the correct_attention_mask branch on March 13, 2023 at 18:10
w4ffl35 pushed a commit to w4ffl35/diffusers that referenced this pull request on Apr 14, 2023

* [attention] Fix attention
* fix
* correct

yoonseokjin pushed a commit to yoonseokjin/diffusers that referenced this pull request on Dec 25, 2023

* [attention] Fix attention
* fix
* correct

AmericanPresidentJimmyCarter pushed a commit to AmericanPresidentJimmyCarter/diffusers that referenced this pull request on Apr 26, 2024

* [attention] Fix attention
* fix
* correct