
[ML] Make warnings from inference errors #81735


Merged: 3 commits merged into elastic:master on Dec 15, 2021

Conversation

davidkyle (Member)

Consistently use exceptions for errors instead of WarningInferenceResults. Using a mixture of the two complicates debugging and triaging, as both warnings and exceptions have to be checked.

Many of the uses of WarningInferenceResults related to validating the output of the native pytorch process. Those unexpected results are more accurately internal server errors, not warnings.

Follow up to #81475
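The design change can be illustrated with a minimal, self-contained sketch. The types and method names below are illustrative stand-ins, not the actual Elasticsearch classes: returning a warning result gives callers two error channels to check, while throwing routes every failure through the single exception path.

```java
public class ErrorChannelDemo {
    // Illustrative stand-ins for the PR's result types (not the real classes).
    interface InferenceResults {}
    static final class TextResult implements InferenceResults {}
    static final class WarningResult implements InferenceResults {
        final String warning;
        WarningResult(String warning) { this.warning = warning; }
    }

    // Before: unexpected native output becomes a special result type, so
    // callers must check for warnings in addition to catching exceptions.
    static InferenceResults inferWithWarning(double[][] output) {
        if (output.length == 0) {
            return new WarningResult("unexpected empty output from the pytorch process");
        }
        return new TextResult();
    }

    // After: the same condition is an internal error, thrown through the one
    // failure channel (the PR itself uses ElasticsearchStatusException with
    // an internal-server-error status; a plain exception stands in here).
    static InferenceResults inferWithException(double[][] output) {
        if (output.length == 0) {
            throw new IllegalStateException("unexpected empty output from the pytorch process");
        }
        return new TextResult();
    }

    public static void main(String[] args) {
        // Old style: even the success path must inspect the result type.
        InferenceResults r = inferWithWarning(new double[0][]);
        System.out.println("warning path: " + (r instanceof WarningResult));

        // New style: the error surfaces where all other failures do.
        try {
            inferWithException(new double[0][]);
        } catch (IllegalStateException e) {
            System.out.println("exception path: " + e.getMessage());
        }
    }
}
```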

@elasticmachine elasticmachine added the Team:ML Meta label for the ML team label Dec 14, 2021
@elasticmachine (Collaborator)

Pinging @elastic/ml-core (Team:ML)

@benwtrent (Member) left a comment:

You need to update the PyTorchModelIT to make sure that no warnings are written.

@@ -371,32 +369,17 @@ protected void doRun() throws Exception {
     processContext,
     request.tokenization,
     processor.getResultProcessor((NlpConfig) config),
-    ActionListener.wrap(this::onSuccess, f -> handleFailure(f, this))
+    ActionListener.wrap(this::onSuccess, this::onFailure)
A reviewer (Member) suggested passing the listener itself instead of wrapping it:

Suggested change
-    ActionListener.wrap(this::onSuccess, this::onFailure)
+    this
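For context, ActionListener.wrap bundles a success callback and a failure callback into one listener, so routing failures through this::onFailure sends exceptions down the same path as any other async failure. A minimal stand-in for the idea (not the real org.elasticsearch.action.ActionListener, whose signatures differ):

```java
import java.util.function.Consumer;

public class ActionListenerDemo {
    // Sketch of the listener pattern: one object, two channels.
    interface ActionListener<T> {
        void onResponse(T result);
        void onFailure(Exception e);

        // Builds a listener from two callbacks, mirroring the wrap(...) idiom.
        static <T> ActionListener<T> wrap(Consumer<T> onSuccess, Consumer<Exception> onError) {
            return new ActionListener<T>() {
                public void onResponse(T result) { onSuccess.accept(result); }
                public void onFailure(Exception e) { onError.accept(e); }
            };
        }
    }

    public static void main(String[] args) {
        ActionListener<String> listener = ActionListener.wrap(
            r -> System.out.println("success: " + r),
            e -> System.out.println("failure: " + e.getMessage())
        );
        listener.onResponse("inference result");
        listener.onFailure(new IllegalStateException("native process error"));
    }
}
```

The reviewer's "this" suggestion follows the same logic one step further: if the enclosing class already implements both channels, it can be passed directly as the listener with no wrapping at all.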

@@ -92,7 +91,7 @@ public void testProcessResults_GivenMissingTokens() {
     PyTorchResult pyTorchResult = new PyTorchResult("1", new double[][][] { { {} } }, 0L, null);
     assertThat(
         FillMaskProcessor.processResult(tokenization, pyTorchResult, tokenizer, 5, randomAlphaOfLength(10)),
-        instanceOf(WarningInferenceResults.class)
+        instanceOf(ElasticsearchStatusException.class)
A reviewer (Member) commented:

this should probably be expectThrows
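The reviewer's point can be shown with a small, self-contained sketch of the expectThrows pattern (a re-implementation for illustration, not the Lucene/Elasticsearch test-framework method): run a lambda, assert the expected exception type was thrown, and return the exception so its message can be inspected. That reads more directly than asserting instanceOf on a value that is now thrown rather than returned. The processResult stand-in below is hypothetical.

```java
public class ExpectThrowsDemo {
    // Minimal re-implementation of the expectThrows idiom (sketch only).
    static <T extends Throwable> T expectThrows(Class<T> expected, Runnable body) {
        try {
            body.run();
        } catch (Throwable t) {
            if (expected.isInstance(t)) {
                return expected.cast(t);  // hand back the exception for inspection
            }
            throw new AssertionError("expected " + expected.getName() + " but got " + t, t);
        }
        throw new AssertionError("expected " + expected.getName() + " but nothing was thrown");
    }

    // Hypothetical stand-in for a processor that now throws on bad input.
    static double processResult(double[] scores) {
        if (scores.length == 0) {
            throw new IllegalStateException("tokenization is empty");
        }
        return scores[0];
    }

    public static void main(String[] args) {
        IllegalStateException e =
            expectThrows(IllegalStateException.class, () -> processResult(new double[0]));
        System.out.println("caught: " + e.getMessage());
    }
}
```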

@benwtrent (Member)

PyTorchModelIT#testPipelineWithBadProcessor

It needs to be updated, as it expects a warning on a missing field and we want that to be an exception, right?

@davidkyle davidkyle merged commit d00f414 into elastic:master Dec 15, 2021
@elasticsearchmachine (Collaborator)

💔 Backport failed

Branch: 8.0 · Result: Commit could not be cherry-picked due to conflicts

You can use sqren/backport to manually backport by running backport --upstream elastic/elasticsearch --pr 81735

davidkyle added a commit to davidkyle/elasticsearch that referenced this pull request Dec 15, 2021
Consistently using exceptions for errors instead of WarningInferenceResults
to simplify debugging/triaging
# Conflicts:
#	x-pack/plugin/ml/qa/native-multi-node-tests/src/javaRestTest/java/org/elasticsearch/xpack/ml/integration/PyTorchModelIT.java
#	x-pack/plugin/ml/src/main/java/org/elasticsearch/xpack/ml/inference/nlp/FillMaskProcessor.java
#	x-pack/plugin/ml/src/main/java/org/elasticsearch/xpack/ml/inference/nlp/NerProcessor.java
#	x-pack/plugin/ml/src/test/java/org/elasticsearch/xpack/ml/inference/nlp/FillMaskProcessorTests.java
#	x-pack/plugin/ml/src/test/java/org/elasticsearch/xpack/ml/inference/nlp/NerProcessorTests.java
#	x-pack/plugin/ml/src/test/java/org/elasticsearch/xpack/ml/inference/nlp/TextClassificationProcessorTests.java
elasticsearchmachine pushed a commit that referenced this pull request Dec 15, 2021
Consistently using exceptions for errors instead of WarningInferenceResults
to simplify debugging/triaging
# Conflicts:
#	x-pack/plugin/ml/qa/native-multi-node-tests/src/javaRestTest/java/org/elasticsearch/xpack/ml/integration/PyTorchModelIT.java
#	x-pack/plugin/ml/src/main/java/org/elasticsearch/xpack/ml/inference/nlp/FillMaskProcessor.java
#	x-pack/plugin/ml/src/main/java/org/elasticsearch/xpack/ml/inference/nlp/NerProcessor.java
#	x-pack/plugin/ml/src/test/java/org/elasticsearch/xpack/ml/inference/nlp/FillMaskProcessorTests.java
#	x-pack/plugin/ml/src/test/java/org/elasticsearch/xpack/ml/inference/nlp/NerProcessorTests.java
#	x-pack/plugin/ml/src/test/java/org/elasticsearch/xpack/ml/inference/nlp/TextClassificationProcessorTests.java
Labels
>bug :ml Machine learning Team:ML Meta label for the ML team v8.0.0-rc1 v8.1.0
5 participants