
different config files for train and predict benchmarks #954


Merged: 2 commits merged into dotnet:master on Sep 23, 2018

Conversation

@Anipik Anipik (Contributor) commented Sep 19, 2018

Fixes #982

  • Different config files for the train and test benchmarks
  • Solves the problem of long running times
  • The train benchmarks run only one iteration, with no warmup, since that better reflects how users actually run training (see the sketch after this list)
  • The predict config is the original version
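
A minimal sketch of how such a split could look with BenchmarkDotNet, assuming separate ManualConfig classes attached via the [Config] attribute; the class and method names below are illustrative, not the actual ones in this PR:

```csharp
using BenchmarkDotNet.Attributes;
using BenchmarkDotNet.Configs;
using BenchmarkDotNet.Jobs;

// Hypothetical train config: a few dedicated launches, one iteration each,
// no warmup, to approximate how users run training.
public class TrainConfig : ManualConfig
{
    public TrainConfig()
    {
        Add(Job.Default
            .WithWarmupCount(0)
            .WithIterationCount(1)
            .WithLaunchCount(3));
    }
}

// Hypothetical predict config: keeps the original (default-style) settings.
public class PredictConfig : ManualConfig
{
    public PredictConfig()
    {
        Add(Job.Default);
    }
}

[Config(typeof(TrainConfig))]
public class TrainBenchmarks
{
    [Benchmark]
    public void Train() { /* training pipeline would go here */ }
}

[Config(typeof(PredictConfig))]
public class PredictBenchmarks
{
    [Benchmark]
    public void Predict() { /* prediction call would go here */ }
}
```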

cc @danmosemsft @eerhardt @adamsitnik @justinormont

@adamsitnik adamsitnik (Member) left a comment


LGTM!

.With(Job.Default
    .WithWarmupCount(0)
    .WithIterationCount(1)
    .WithLaunchCount(3)

nit: you could add a comment here saying that BDN will start 3 dedicated processes, each of which will run the given benchmark only once, without any warmup, to mimic the real-world scenario as suggested by @justinormont
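
For illustration, the suggested comment could sit directly above the job definition; this is only a sketch of the reviewer's suggestion, mirroring the snippet above rather than the actual file in the PR (the factory class name is hypothetical):

```csharp
using BenchmarkDotNet.Configs;
using BenchmarkDotNet.Jobs;

public static class TrainConfigFactory
{
    // Hypothetical factory; the actual config in this PR may be structured differently.
    public static IConfig Create() =>
        DefaultConfig.Instance
            // BDN will start 3 dedicated processes; each of them runs the
            // given benchmark exactly once, with no warmup, to mimic the
            // real-world scenario suggested by @justinormont.
            .With(Job.Default
                .WithWarmupCount(0)
                .WithIterationCount(1)
                .WithLaunchCount(3));
}
```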

@Anipik Anipik (Contributor, Author) commented Sep 21, 2018

@justinormont, can you please review this one?

@eerhardt eerhardt (Member) left a comment


Just two comments. Rest looks good.

@markusweimer markusweimer (Member) commented
This PR does not reference an issue. Can you please file one and reference it in the PR description?

@Anipik Anipik closed this Sep 21, 2018
@Anipik Anipik reopened this Sep 21, 2018
@justinormont justinormont (Contributor) left a comment


Looks good to me.

Can you paste in the output (specifically to see the runtimes) of the benchmarks?

What was the runtime before/after the changes?

@Anipik Anipik (Contributor, Author) commented Sep 21, 2018

Can you paste in the output (specifically to see the runtimes) of the benchmarks?

Will update the PR with the latest numbers.

@Anipik Anipik (Contributor, Author) commented Sep 23, 2018

After the changes:

| Method | Mean | Error | StdDev | Allocated |
|--------|-----:|------:|-------:|----------:|
| CV_Multiclass_WikiDetox_BigramsAndTrichar_OVAAveragedPerceptron | 101.2 s | 16.31 s | 0.9213 s | 771.41 KB |
| CV_Multiclass_WikiDetox_BigramsAndTrichar_LightGBMMulticlass | 308.3 s | 108.40 s | 6.1247 s | 792.85 KB |
| CV_Multiclass_WikiDetox_WordEmbeddings_OVAAveragedPerceptron | 311.5 s | 24.24 s | 1.3695 s | 781.23 KB |
| CV_Multiclass_WikiDetox_WordEmbeddings_SDCAMC | 193.9 s | 19.42 s | 1.0975 s | 772.61 KB |
| TrainTest_Ranking_MSLRWeb10K_RawNumericFeatures_FastTreeRanking | 31.33 s | 3.698 s | 0.2089 s | 15418.98 MB |
| TrainTest_Ranking_MSLRWeb10K_RawNumericFeatures_LightGBMRanking | 28.93 s | 19.583 s | 1.1065 s | 208.93 MB |
| Test_Ranking_MSLRWeb10K_RawNumericFeatures_FastTreeRanking | 1.053 s | 0.0389 s | 0.0432 s | 3.1 MB |
| Test_Multiclass_WikiDetox_BigramsAndTrichar_OVAAveragedPerceptron | 5.337 s | 0.1048 s | 0.1165 s | 238.23 MB |

@Anipik Anipik (Contributor, Author) commented Sep 23, 2018

@justinormont I updated the numbers for the tests whose configuration we changed. Can we go ahead and merge this one?

@Anipik Anipik (Contributor, Author) commented Sep 23, 2018

The before numbers are in the email thread, @justinormont.

@justinormont justinormont merged commit b88cc09 into dotnet:master Sep 23, 2018
@adamsitnik adamsitnik mentioned this pull request Sep 24, 2018
@Anipik Anipik deleted the config2 branch October 10, 2018 18:23
@ghost ghost locked as resolved and limited conversation to collaborators Mar 28, 2022