
[ML][Inference] changing setting to be memorySizeSetting #49259


Conversation

benwtrent
Member

While going through the settings and experimenting, I found that having xpack.ml.inference_model.cache_size strictly be a byte-sized value is problematic.

What if the user provides a byte-sized value that is larger than the heap?
What if the node's heap is smaller than the default value?

Having it as a memorySizeSetting allows users (and us) to set the value either as a percentage of heap or as a static byte-sized value.
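To illustrate the distinction, here is a minimal, self-contained sketch of how a memory-size setting can accept either form. This is a hypothetical illustration of the semantics, not the actual Elasticsearch Setting implementation; the class and method names are made up for this example.

```java
// Hypothetical sketch: a memory-size setting accepts either a percentage
// of the node's heap (e.g. "40%") or an absolute byte value (e.g. "512mb").
// A plain byte-size setting would only accept the latter, which can exceed
// the heap on small nodes.
public class MemorySizeSettingSketch {

    // Resolve a setting string against the given heap size in bytes.
    static long resolve(String value, long heapBytes) {
        value = value.trim().toLowerCase();
        if (value.endsWith("%")) {
            // Percentage form scales with the actual heap of the node.
            double pct = Double.parseDouble(value.substring(0, value.length() - 1));
            return (long) (heapBytes * pct / 100.0);
        }
        // Absolute forms are fixed regardless of heap size.
        if (value.endsWith("gb")) return Long.parseLong(value.substring(0, value.length() - 2)) * 1024L * 1024 * 1024;
        if (value.endsWith("mb")) return Long.parseLong(value.substring(0, value.length() - 2)) * 1024L * 1024;
        if (value.endsWith("kb")) return Long.parseLong(value.substring(0, value.length() - 2)) * 1024L;
        if (value.endsWith("b"))  return Long.parseLong(value.substring(0, value.length() - 1));
        throw new IllegalArgumentException("unrecognized memory size: " + value);
    }

    public static void main(String[] args) {
        long heap = 1024L * 1024 * 1024; // pretend the node has a 1gb heap
        System.out.println(resolve("40%", heap));   // scales with heap, can never exceed it
        System.out.println(resolve("512mb", heap)); // fixed, independent of heap
    }
}
```

With the percentage form, a small node automatically gets a proportionally small cache, avoiding the two failure modes above.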

@elasticmachine
Collaborator

Pinging @elastic/ml-core (:ml)

@benwtrent benwtrent force-pushed the feature/ml-inference-validate-model-cache-setting branch from ba67ce1 to 4b52cbe on November 18, 2019 18:32
@benwtrent
Member Author

run elasticsearch-ci/packaging-sample-matrix

Contributor

@droberts195 droberts195 left a comment


LGTM

@droberts195
Contributor

I noticed this setting isn't in the docs. Could you please add it? (You can do this in a subsequent PR so that this one can be merged while it's green.)

@benwtrent benwtrent merged commit f27511c into elastic:master Nov 19, 2019
@benwtrent benwtrent deleted the feature/ml-inference-validate-model-cache-setting branch November 19, 2019 12:10