You can embed the preprocessing into your model and then convert the combined module to ONNX:

```python
import torch

class InferenceModel(torch.nn.Module):
    def __init__(self, my_model, mean, std):
        super().__init__()
        self.my_model = my_model
        # Buffers are saved with the module and exported into the ONNX graph.
        self.register_buffer("mean", mean)
        self.register_buffer("std", std)

    def forward(self, x):
        x = x - self.mean
        x = x / self.std
        x = self.my_model(x)
        return x
```
Hi,
I found the implementation very efficient and straightforward. However, could you give me a hint on how to run inference with OpenCV? I can convert the model to ONNX, but how should I handle the preprocessing function?
Any small code example will suffice.
Thanks
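As a hedged sketch of what the OpenCV side could look like: once normalization is baked into the exported graph, the caller only needs to scale pixels to [0, 1] and build an NCHW float blob. The actual `cv2.dnn` calls are shown in comments (they require an OpenCV build and an ONNX file; the filename here is illustrative), and the blob construction is replicated in NumPy below them:

```python
import numpy as np

# Hypothetical HWC uint8 BGR frame, as cv2.imread would return it.
img = np.random.randint(0, 256, (224, 224, 3), dtype=np.uint8)

# With a real OpenCV build, inference would look roughly like:
#   net = cv2.dnn.readNetFromONNX("model_with_preproc.onnx")
#   blob = cv2.dnn.blobFromImage(img, scalefactor=1 / 255.0, swapRB=True)
#   net.setInput(blob)
#   out = net.forward()
# No mean/std arguments are needed, since the model normalizes internally.

# NumPy equivalent of that blobFromImage call:
rgb = img[:, :, ::-1]                        # swapRB=True: BGR -> RGB
blob = rgb.astype(np.float32) / 255.0        # scalefactor = 1/255
blob = blob.transpose(2, 0, 1)[None, ...]    # HWC -> NCHW, add batch dim

print(blob.shape)  # (1, 3, 224, 224)
```

Note that `blobFromImage` takes a single scalar `scalefactor` but a per-channel `mean`, so per-channel std division cannot be expressed there; that is one more reason to embed the normalization in the model itself, as suggested above.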