
Commit 418d8a6

fix Swin Transformer inplace mutation (#6266)

* fix inplace mutation
* Different attn shouldn't share the same attribute
* a simpler solution

Co-authored-by: YosuaMichael <[email protected]>

1 parent: 77940b8

File tree: 1 file changed, +1 -0 lines


torchvision/models/swin_transformer.py (1 addition, 0 deletions)

@@ -106,6 +106,7 @@ def shifted_window_attention(
     x = F.pad(input, (0, 0, 0, pad_r, 0, pad_b))
     _, pad_H, pad_W, _ = x.shape

+    shift_size = shift_size.copy()
     # If window size is larger than feature size, there is no need to shift window
     if window_size[0] >= pad_H:
         shift_size[0] = 0
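The one-line fix works because `shifted_window_attention` previously zeroed elements of the `shift_size` list in place; since Python passes the list by reference, that mutation leaked back into the caller's attribute and persisted across calls (and across attention blocks sharing it). Copying the list first keeps the mutation local. A minimal sketch of the failure mode, with hypothetical standalone functions rather than the actual torchvision code:

```python
def attention_without_copy(window_size, shift_size, pad_H):
    # BUG (pre-fix behavior): mutates the caller's list in place,
    # so a shared attribute is silently changed for later calls.
    if window_size[0] >= pad_H:
        shift_size[0] = 0
    return shift_size


def attention_with_copy(window_size, shift_size, pad_H):
    # FIX (as in the commit): work on a local copy so the
    # caller's list is left untouched.
    shift_size = shift_size.copy()
    if window_size[0] >= pad_H:
        shift_size[0] = 0
    return shift_size


shared_shift = [3, 3]  # stands in for a shared module attribute
attention_without_copy([8, 8], shared_shift, pad_H=4)
print(shared_shift)    # [0, 3] -- shared state was mutated

shared_shift = [3, 3]
attention_with_copy([8, 8], shared_shift, pad_H=4)
print(shared_shift)    # [3, 3] -- unchanged
```

Placing the `.copy()` inside the function (rather than at each call site) is the "simpler solution" the commit message refers to: one line covers every caller.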
