
OnnxTransform -- Update to OnnxRuntime 0.2.0 #2085


Merged
21 commits merged into dotnet:master on Jan 31, 2019

Conversation

jignparm (Contributor) commented on Jan 9, 2019:

Add Linux support for OnnxTransform.

Fixes #2056
Fixes #2106
Fixes #2149

codecov bot commented on Jan 29, 2019:

Codecov Report

Merging #2085 into master will increase coverage by <.01%.
The diff coverage is 58.82%.

@@            Coverage Diff             @@
##           master    #2085      +/-   ##
==========================================
+ Coverage   71.15%   71.16%   +<.01%     
==========================================
  Files         779      780       +1     
  Lines      140311   140338      +27     
  Branches    16047    16043       -4     
==========================================
+ Hits        99838    99868      +30     
+ Misses      36021    36019       -2     
+ Partials     4452     4451       -1
Flag Coverage Δ
#Debug 71.16% <58.82%> (ø) ⬆️
#production 67.57% <0%> (ø) ⬆️
#test 85.09% <71.42%> (+0.01%) ⬆️

@@ -7,7 +7,7 @@

   <ItemGroup>
     <ProjectReference Include="../Microsoft.ML/Microsoft.ML.nupkgproj" />
-    <PackageReference Include="Microsoft.ML.OnnxRuntime.Gpu" Version="$(MicrosoftMLOnnxRuntimeGpuPackageVersion)"/>
+    <PackageReference Include="Microsoft.ML.OnnxRuntime" Version="$(MicrosoftMLOnnxRuntimePackageVersion)"/>
eerhardt (Member) commented on Jan 29, 2019:

Why aren't we using the GPU package anymore?

jignparm (Contributor, Author) replied:
The GPU package will still be available on NuGet.org. On Linux in particular, the GPU package blocks even CPU execution if the CUDA libraries are not installed.
Using the CPU package will also enable Linux testing, if an Ubuntu CI leg becomes available in the future.


eerhardt (Member) replied:
I’m not understanding this. Will ML.NET use the GPU package or not? Why did we switch to the GPU package before, only to switch back off the GPU package now?

jignparm (Contributor, Author) commented on Jan 30, 2019:

The primary reason is that we should depend on a package that runs on multiple platforms without additional requirements. The OnnxRuntime GPU package won't run on Linux without the CUDA libraries installed, even if a user only wants to run on CPU. To avoid confusion, we should default to the CPU package, which runs cross-platform (Ubuntu Linux at least, and macOS in the future) without requiring any extra installation by the user. To use the GPU functionality, we can ask users to install CUDA and the GPU OnnxRuntime package. When we add macOS support, the runtime will be available only in the CPU package (there is no GPU runtime for Mac), so long term the CPU package seems like the better option.

Most other frameworks also ship two separate packages and, for the GPU package, require CUDA to be installed. This keeps package sizes down and also simplifies licensing.

See the MXNet GPU installation here --> https://mxnet.incubator.apache.org/versions/master/install/windows_setup.html#install-with-gpus.
And Theano as well --> http://deeplearning.net/software/theano/install_windows.html.

The OnnxTransform's XML documentation currently says to use the Microsoft.ML.OnnxRuntime.Gpu package if users want to run on a GPU. Currently they would need to rebuild ML.NET from source to do this. We could add an optional OnnxTransform-Gpu as a separate transform that uses the GPU package. To test it properly, however, we'd need to add a GPU CI leg.

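As a sketch of what the opt-in described above could look like from a consumer's side, the project file below swaps the default CPU runtime for the GPU one. This is a hypothetical consumer project, not part of this PR; the package names match the diff above, and the target framework and 0.2.0 version are illustrative.

```xml
<!-- Hypothetical consumer project opting into GPU execution.
     The GPU package additionally requires CUDA libraries on the machine. -->
<Project Sdk="Microsoft.NET.Sdk">
  <PropertyGroup>
    <OutputType>Exe</OutputType>
    <TargetFramework>netcoreapp2.1</TargetFramework>
  </PropertyGroup>
  <ItemGroup>
    <!-- Default, cross-platform CPU runtime:
    <PackageReference Include="Microsoft.ML.OnnxRuntime" Version="0.2.0" />
    -->
    <!-- GPU runtime instead (Windows/Linux with CUDA installed): -->
    <PackageReference Include="Microsoft.ML.OnnxRuntime.Gpu" Version="0.2.0" />
  </ItemGroup>
</Project>
```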

eerhardt (Member) replied:

OK, this works for now. I agree it is best to have our "default" work in as many places as possible.

> documentation currently says to use the Microsoft.ML.OnnxRuntime.Gpu package if users want to run on GPU. Currently they'll need to rebuild MLNET from source to do this.

We typically don't take this route in .NET libraries*. Instead, we provide all options in binary form, i.e. add an optional OnnxTransform-Gpu as a separate package/transform.

(*) Reasons for this:

  1. It means the files are no longer signed by Microsoft.
  2. It messes with dependencies: if someone wanted to ship a package that depended on the OnnxTransform.Gpu package, they would need to redistribute it themselves. And if two people wanted to do it, they would end up with separate, incompatible packages.

@jignparm jignparm changed the title [WIP] OnnxTransform Linux x64 support OnnxTransform Linux x64 support Jan 29, 2019
@jignparm jignparm changed the title OnnxTransform Linux x64 support OnnxTransform -- Update to OnnxRuntime 0.2.0 Jan 29, 2019
eerhardt (Member) left a review:

My main concern right now is the introduction of the sentinel value NullGpuID when we already have a value that will work just fine: null. I'd like to see that addressed before merging.

My concern about the tests can be fixed in a subsequent PR.
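The nullable-versus-sentinel point in the review above can be sketched as follows. This is a minimal illustration, not ML.NET's actual code; the names `NullGpuID` and `gpuDeviceId` are taken from the discussion, everything else is hypothetical.

```csharp
// Sentinel style: a magic value stands in for "no GPU".
// Any consumer must know (and agree on) the sentinel.
static class SentinelStyle
{
    public const int NullGpuID = -1; // hypothetical sentinel

    public static bool UsesGpu(int gpuDeviceId = NullGpuID)
        => gpuDeviceId != NullGpuID;
}

// Nullable style: the type system expresses "no GPU" directly,
// so no magic value can collide with a real device id.
static class NullableStyle
{
    public static bool UsesGpu(int? gpuDeviceId = null)
        => gpuDeviceId.HasValue;
}
```

The nullable form also prevents a caller from accidentally passing the sentinel as a "real" device id, since `null` is not a valid `int`.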

eerhardt (Member) left a review:

Just a couple of things to clean up.

shmoradims left a comment:
:shipit:

Ivanidzo4ka (Contributor) commented on Jan 31, 2019:

> minor change to kick off build

You don't have to do that; you can always restart or retry the build via the UI.

@jignparm jignparm merged commit 46efb8e into dotnet:master Jan 31, 2019
@jignparm jignparm deleted the jignparm/onnx_linx_support branch on January 31, 2019
@ghost ghost locked as resolved and limited conversation to collaborators Mar 25, 2022