Constant folding on Torch Dialect #1182

Closed
ZihengJiang opened this issue Aug 9, 2022 · 2 comments
ZihengJiang commented Aug 9, 2022

After converting a PyTorch model to the Torch dialect, parts of the program are lowered to IR like this:

    %6 = torch.vtensor.literal(dense<5> : tensor<si64>) : !torch.vtensor<[],si64>
    ...
    %451 = torch.prim.NumToTensor.Scalar %int5 : !torch.int -> !torch.vtensor<[],si64>
    %452 = torch.aten.div.Tensor_mode %451, %6, %str : !torch.vtensor<[],si64>, !torch.vtensor<[],si64>, !torch.str -> !torch.vtensor<[],si64>
    %453 = torch.aten.Int.Tensor %452 : !torch.vtensor<[],si64> -> !torch.int
    %454 = torch.prim.ListConstruct %453, %int5, %int-1 : (!torch.int, !torch.int, !torch.int) -> !torch.list<int>
    %455 = torch.aten.view %450, %454 : !torch.vtensor<[5,2048],f32>, !torch.list<int> -> !torch.vtensor<[?,5,?],f32>

As you can see, most of these ops can be pre-computed, since their inputs are all constant integers; that would give %455 a static shape.

Any ideas on how to handle this?
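For illustration, the folding being asked for amounts to evaluating the scalar chain at compile time and then resolving the `view` target shape. Below is a minimal Python sketch of that computation; the rounding mode `"floor"` for `%str` and the helper names are assumptions for illustration, not torch-mlir's actual folding code.

```python
# Sketch of constant-folding the IR above at "compile time".

def fold_div_tensor_mode(lhs: int, rhs: int, rounding_mode: str) -> int:
    """Fold aten.div.Tensor_mode on 0-d si64 tensors with constant operands.

    Assumption: "floor" behaves like Python floor division and
    "trunc" rounds toward zero, mirroring PyTorch's semantics.
    """
    if rounding_mode == "floor":
        return lhs // rhs
    if rounding_mode == "trunc":
        return int(lhs / rhs)
    raise ValueError(f"unsupported rounding mode: {rounding_mode}")

def infer_view_shape(input_shape, view_sizes):
    """Resolve a single -1 in the view sizes from the element count."""
    numel = 1
    for d in input_shape:
        numel *= d
    known = 1
    for s in view_sizes:
        if s != -1:
            known *= s
    return [numel // known if s == -1 else s for s in view_sizes]

# %451 and %6 are both the constant 5; %str is assumed to be "floor".
folded = fold_div_tensor_mode(5, 5, "floor")          # %453 -> 1
shape = infer_view_shape([5, 2048], [folded, 5, -1])  # %455 -> [1, 5, 2048]
print(folded, shape)
```

With the division folded to 1, the list `[1, 5, -1]` fully determines the output shape, so `%455` could be refined from `!torch.vtensor<[?,5,?],f32>` to a static `!torch.vtensor<[1,5,2048],f32>`.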

@silvasean (Contributor)

We already partially support this; it just requires extending the folding to "div". We can generalize the logic here: #935

@ZihengJiang (Collaborator, Author)

This issue was solved by #1209. Thanks @Vremold.
