
Better error message when an inference model cannot be parsed due to its size #59166


Merged
merged 1 commit into from
Jul 8, 2020

Conversation

davidkyle
Member

When parsing model definitions there is, sensibly, a limit on how large the stream size can be. When that limit is exceeded, the actual reason is buried at the end of a long chain of XContentParseExceptions:

{
  "error": {
    "root_cause": [
      {
        "type": "x_content_parse_exception",
        "reason": "[1:107374158] [tree_inference_model_node] failed to parse object"
      }
    ],
    "type": "x_content_parse_exception",
    "reason": "[1:107374158] [inference_model_definition] failed to parse field [trained_model]",
    "caused_by": {
      "type": "x_content_parse_exception",
      "reason": "[1:107374158] [trained_model] failed to parse field [ensemble]",
      "caused_by": {
        "type": "x_content_parse_exception",
        "reason": "[1:107374158] [ensemble_inference_model] failed to parse field [trained_models]",
        "caused_by": {
          "type": "x_content_parse_exception",
          "reason": "[1:107374158] [trained_models] failed to parse field [tree]",
          "caused_by": {
            "type": "x_content_parse_exception",
            "reason": "[1:107374158] [tree_inference_model] failed to parse field [tree_structure]",
            "caused_by": {
              "type": "x_content_parse_exception",
              "reason": "[1:107374158] [tree_inference_model_node] failed to parse object",
              "caused_by": {
                "type": "i_o_exception",
                "reason": "input stream exceeded maximum bytes of [107374182]"
              }
            }
          }
        }
      }
    }
  }
}

Here "reason": "input stream exceeded maximum bytes of [107374182]" is the interesting part.

This change surfaces that reason at the top level and provides a more helpful message:

"Cannot parse model definition as the content is larger than the maximum stream size of [X] bytes. Max stream size is 10% of the JVM heap or 1GB whichever is smallest"
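The general technique behind this fix can be sketched as walking the exception's cause chain to find the root cause, then raising a clearer top-level message when the root cause is the stream-size limit. This is a hypothetical, self-contained illustration, not Elasticsearch's actual implementation; the class and message below are assumptions for the sketch, using generic JDK exception types in place of XContentParseException.

```java
import java.io.IOException;

// Hypothetical sketch: surface the root cause of a deeply nested
// chain of parse exceptions, as this PR does for the stream-size limit.
public class RootCauseDemo {

    // Walk getCause() links until the innermost throwable is reached.
    static Throwable rootCause(Throwable t) {
        Throwable cause = t;
        while (cause.getCause() != null && cause.getCause() != cause) {
            cause = cause.getCause();
        }
        return cause;
    }

    public static void main(String[] args) {
        // Simulate the nested chain from the error above: parse exceptions
        // wrapping an IOException about the maximum stream size.
        Exception chain = new IllegalArgumentException(
            "[inference_model_definition] failed to parse field [trained_model]",
            new IllegalArgumentException(
                "[tree_inference_model] failed to parse field [tree_structure]",
                new IOException("input stream exceeded maximum bytes of [107374182]")));

        Throwable cause = rootCause(chain);
        if (cause instanceof IOException) {
            // Re-report with a clearer, top-level message instead of
            // leaving the reason buried at the bottom of the chain.
            System.out.println("Cannot parse model definition as the content "
                + "is larger than the maximum stream size. Root cause: "
                + cause.getMessage());
        }
    }
}
```

Walking to the innermost cause rather than inspecting a fixed nesting depth keeps the check robust no matter how deep the tree or ensemble structure happens to be.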

@elasticmachine
Collaborator

Pinging @elastic/ml-core (:ml)

@benwtrent benwtrent self-requested a review July 7, 2020 16:06
@davidkyle davidkyle merged commit 6bc3d07 into elastic:master Jul 8, 2020
@davidkyle davidkyle deleted the better-max-buff-error branch July 8, 2020 07:34
davidkyle added a commit to davidkyle/elasticsearch that referenced this pull request Jul 8, 2020
…lastic#59166)

The actual cause can be lost in a long list of parse exceptions
this surfaces the cause when the problem is size.
davidkyle added a commit that referenced this pull request Jul 9, 2020
…59166) (#59209)

The actual cause can be lost in a long list of parse exceptions
this surfaces the cause when the problem is size.
4 participants