Commit 9dd02d8
[Bug] Fix usage of .transpose() and .view() consecutively. (#11979)
Parent commit: f7b3ba8
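Background: torch.Tensor.view() never copies data, so it only succeeds when the requested shape is compatible with the tensor's current strides. transpose() swaps strides without moving any data, which generally leaves the tensor non-contiguous, and a subsequent .view() then raises a RuntimeError. .reshape() returns a view when it can and falls back to a copy when it must, which is why both call sites below switch to it. A minimal repro sketch (the shapes are illustrative, not taken from the patched code):

    import torch

    # (batch, heads, seq_len, head_dim) -- illustrative sizes
    out = torch.randn(2, 8, 16, 64)

    # transpose() swaps strides without moving data, so the result is
    # non-contiguous: shape (2, 16, 8, 64)
    out = out.transpose(1, 2)
    assert not out.is_contiguous()

    # reshape() copies when no zero-copy view exists -- this succeeds
    merged = out.reshape(2, 16, -1)        # (2, 16, 512)

    # view() never copies, so the same request fails:
    # RuntimeError: view size is not compatible with input tensor's
    # size and stride (...). Use .reshape(...) instead.
    out.view(2, 16, -1)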

File tree

2 files changed (+2, -2 lines)

vllm/attention/layer.py

Lines changed: 1 addition & 1 deletion

@@ -230,7 +230,7 @@ def forward(
                                              value,
                                              scale=self.scale)
         out = out.transpose(1, 2)
-        return out.view(bsz, q_len, -1)
+        return out.reshape(bsz, q_len, -1)
 
 
 def unified_attention(

vllm/model_executor/models/intern_vit.py

Lines changed: 1 addition & 1 deletion

@@ -271,7 +271,7 @@ def forward(self, x: torch.Tensor) -> torch.Tensor:
         v = v.transpose(1, 2)
 
         x = F.scaled_dot_product_attention(q, k, v, scale=self.scale)
-        x = x.transpose(1, 2).view(B, N, -1)
+        x = x.transpose(1, 2).reshape(B, N, -1)
 
         x = self.proj(x)
         return x
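Both hunks apply the same fix. For reference, an equivalent rewrite (not what this commit does) would be to force a contiguous copy before the view; .reshape() is the lighter choice because it only copies when the memory layout actually requires it:

    # Equivalent, but always materializes a copy when non-contiguous;
    # reshape() is preferred since it copies only when necessary.
    x = x.transpose(1, 2).contiguous().view(B, N, -1)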
