Commit 1411b33

Update src/diffusers/models/transformers/transformer_wan.py
1 parent: 74e34e5

File tree

1 file changed: +1 −1

Diff for: src/diffusers/models/transformers/transformer_wan.py

+1 −1
@@ -537,7 +537,7 @@ def apply_rotary_emb(hidden_states: torch.Tensor, freqs: torch.Tensor):
                 query, key, value, attn_mask=attention_mask, dropout_p=0.0, is_causal=False
             )
         else:
-            # Perturbed attention applied only when self-attention
+            # Perturbed attention applied only to self-attention path
             hidden_states = value
 
         hidden_states = hidden_states.transpose(1, 2).flatten(2, 3)
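
For context, the else branch touched by this diff is the perturbed-attention path: instead of computing scaled dot-product attention, it passes the value tensor straight through, which is equivalent to using an identity attention map (the usual way perturbed-attention guidance perturbs self-attention). Below is a minimal, self-contained sketch of that dispatch; the function name and the perturb flag are hypothetical stand-ins for the actual condition used in the Wan attention processor, not names from the library.

import torch
import torch.nn.functional as F
from typing import Optional

def attention_with_optional_perturbation(
    query: torch.Tensor,        # (batch, heads, seq_len, head_dim)
    key: torch.Tensor,
    value: torch.Tensor,
    attention_mask: Optional[torch.Tensor] = None,
    perturb: bool = False,      # hypothetical stand-in for the processor's perturbation condition
) -> torch.Tensor:
    if not perturb:
        # Regular scaled dot-product attention, matching the unchanged branch in the diff.
        hidden_states = F.scaled_dot_product_attention(
            query, key, value, attn_mask=attention_mask, dropout_p=0.0, is_causal=False
        )
    else:
        # Perturbed attention applied only to the self-attention path: skipping the
        # query/key softmax acts as an identity attention map, so the output is just
        # the value projection.
        hidden_states = value

    # Merge heads back into the channel dimension, mirroring the trailing context line:
    # (batch, heads, seq_len, head_dim) -> (batch, seq_len, heads * head_dim).
    hidden_states = hidden_states.transpose(1, 2).flatten(2, 3)
    return hidden_states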

0 commit comments
