Commit 59dff9a

NicolasHug authored and YosuaMichael committed

[fbsync] fix Swin Transformer inplace mutation (#6266)

Summary:
* fix inplace mutation
* different attn modules shouldn't share the same attribute
* a simpler solution

Reviewed By: jdsgomes
Differential Revision: D37993419
fbshipit-source-id: 2a08a62168c4e6ee6c5a2ca934de88aa04361016
Co-authored-by: YosuaMichael <[email protected]>

1 parent ac073ce

File tree

1 file changed: +1 -0 lines changed

torchvision/models/swin_transformer.py (1 addition, 0 deletions)

```diff
@@ -106,6 +106,7 @@ def shifted_window_attention(
     x = F.pad(input, (0, 0, 0, pad_r, 0, pad_b))
     _, pad_H, pad_W, _ = x.shape
 
+    shift_size = shift_size.copy()
     # If window size is larger than feature size, there is no need to shift window
     if window_size[0] >= pad_H:
         shift_size[0] = 0
```
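The one-line fix above copies `shift_size` before the function zeroes its entries, so the caller's list (an attribute shared across attention modules in the original bug report) is no longer mutated in place. A minimal sketch of the bug class and the fix, using illustrative function names rather than torchvision's actual API:

```python
def shift_no_copy(shift_size, window_size, pad):
    # BUG: zeroing entries here mutates the caller's list in place,
    # so a shared attribute is silently changed for everyone holding it.
    for i in range(len(shift_size)):
        if window_size[i] >= pad[i]:
            shift_size[i] = 0
    return shift_size


def shift_with_copy(shift_size, window_size, pad):
    shift_size = shift_size.copy()  # the commit's fix: work on a local copy
    for i in range(len(shift_size)):
        if window_size[i] >= pad[i]:
            shift_size[i] = 0
    return shift_size


shared = [3, 3]
shift_no_copy(shared, [7, 7], [7, 7])
print(shared)  # [0, 0] -- shared state was clobbered

shared = [3, 3]
result = shift_with_copy(shared, [7, 7], [7, 7])
print(shared)  # [3, 3] -- caller's list untouched
print(result)  # [0, 0] -- the zeroed copy is returned instead
```

Copying costs a tiny list allocation per call but keeps the function free of side effects, which is why it was chosen as "a simpler solution" over restructuring how the attribute is shared.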

0 commit comments
