🐛 [Bug] Encountered bug when using torch.ops.aten.cat converter #1862

Labels: bug (Something isn't working)
Bug Description

When compiling the T5-Base model via the `aten` path, the following error is encountered:

To Reproduce
Steps to reproduce the behavior:

1. Initialize the model: `T5Model.from_pretrained("t5-base").eval().cuda()`
2. Initialize sample inputs, e.g. `torch.randint(0, 1, (1, 14), dtype=torch.int32).to("cuda")`, one for each of the keys `("input_ids", "attention_mask", "decoder_input_ids")`
3. Use the `transformers` tools to trace the model via: `transformers.utils.fx.symbolic_trace(model, input_names=["input_ids", "attention_mask", "decoder_input_ids"])`
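For context, the converter named in the title targets the ATen-level concatenation op. The following CPU-only sketch exercises that op in isolation; it is illustrative only and not part of the original report:

```python
import torch

# torch.ops.aten.cat is the ATen-level op behind torch.cat; the
# converter in question handles calls to this op during lowering.
a = torch.zeros(1, 3)
b = torch.ones(1, 3)

# Concatenate along dim 0 -> a (2, 3) tensor
out = torch.ops.aten.cat([a, b], 0)
assert out.shape == (2, 3)
assert torch.equal(out, torch.cat([a, b], dim=0))
```

In the traced T5 graph, such `aten.cat` nodes are what the converter must handle, which is where the reported error arises.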
Expected behavior

Model should compile via the `aten` path.

Environment
Built via `python setup.py develop`