🐛 [Bug] KeyError after resolveNonTensorInputs #1018
Comments
I have a fix for this that modifies

Hi @mfeliz-cruise, thank you for your help!

You should be able to open a PR without any special privileges. Create a fork of this repo on your GitHub account and push your patch to that. GitHub will guide you through opening a PR to upstream.

@bowang007 @narendasan Thanks, I'll try it.
Bug Description
Torch-TensorRT attempts to resolve all non-tensor inputs of a torch block if any of those inputs are generated by TensorRT blocks. This leads to a failed attempt to resolve a dictionary input to a torch block when that input is generated by another torch block. `getDependencyNodes` fails to identify the `aten::_set_item` node as a dependency, which results in a KeyError.

This is the original graph. It is a small artificial test case intended only to reproduce this issue.
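The failure mode can be modeled in pure Python. This is an illustrative sketch only, not Torch-TensorRT's actual C++ `getDependencyNodes`; the `Node` class, the toy graph, and `naive_dependency_nodes` are all invented for this example:

```python
# Illustrative model of the bug, not Torch-TensorRT's actual C++ code.
# Node, the toy graph, and the walk below are invented for this sketch.

class Node:
    def __init__(self, kind, inputs, outputs):
        self.kind, self.inputs, self.outputs = kind, inputs, outputs

# Miniature version of the reported graph: %out_dict.1 is created by
# prim::DictConstruct and then mutated in place by aten::_set_item,
# which produces no output value of its own.
graph = [
    Node("prim::DictConstruct", [], ["%out_dict.1"]),
    Node("aten::_set_item", ["%out_dict.1", "%key", "%value"], []),
    Node("aten::relu", ["%x"], ["%value"]),
]

producers = {out: n for n in graph for out in n.outputs}

def naive_dependency_nodes(value):
    """Walk only producer edges; in-place mutators are never reached."""
    deps, stack = [], [value]
    while stack:
        v = stack.pop()
        n = producers.get(v)
        if n is not None and n not in deps:
            deps.append(n)
            stack.extend(n.inputs)
    return deps

deps = naive_dependency_nodes("%out_dict.1")
print([n.kind for n in deps])  # only prim::DictConstruct; aten::_set_item is missed
```

Because `aten::_set_item` mutates `%out_dict.1` rather than producing it, a walk over producer edges alone never visits it, so the dictionary is copied without its contents.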
It is segmented as follows. The `Tensor?[]` input to @2 from @1 will need to be resolved, triggering resolution of all @2 inputs, including `%out_dict.1`, which is a dictionary created in @0. After `resolveNonTensorInputs` we can see that the `prim::DictConstruct()` node for `%out_dict.1` is copied into @2 without the following `aten::_set_item` node.

To Reproduce
Steps to reproduce the behavior:
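The issue's actual repro script is not shown here. A hypothetical minimal module matching the described graph shape (a dict built via `prim::DictConstruct` and mutated with `aten::_set_item`) might look like this; the module name and body are assumptions, not the reporter's code:

```python
import torch
from typing import Dict

# Hypothetical module echoing the report's graph shape: a dict is
# created (prim::DictConstruct) and then mutated (aten::_set_item).
class DictRepro(torch.nn.Module):
    def forward(self, x: torch.Tensor) -> torch.Tensor:
        out_dict: Dict[str, torch.Tensor] = {}   # prim::DictConstruct
        out_dict["y"] = torch.relu(x)            # aten::_set_item
        return out_dict["y"] + 1.0

scripted = torch.jit.script(DictRepro())
# The TorchScript graph is expected to contain both nodes.
print(scripted.graph)
```

Compiling such a module with Torch-TensorRT's partitioning enabled would then exercise the failing `resolveNonTensorInputs` path described above.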
Expected behavior
Torch-TensorRT should not attempt to resolve non-tensor inputs of torch blocks that are generated by torch blocks. If it does choose to resolve a dictionary input, it should include `aten::_set_item` as a dependency.