
[DOCS] Adds GET, GET stats and DELETE inference APIs #50224


Merged (10 commits) on Dec 18, 2019
59 changes: 59 additions & 0 deletions docs/reference/ml/df-analytics/apis/delete-trained-model.asciidoc
@@ -0,0 +1,59 @@
[role="xpack"]
[testenv="platinum"]
[[delete-inference]]
=== Delete trained model API
[subs="attributes"]
++++
<titleabbrev>Delete trained model</titleabbrev>
++++

Deletes an existing trained {dfanalytics} model that is currently not referenced
by an ingest pipeline.

experimental[]


[[ml-delete-inference-request]]
==== {api-request-title}

`DELETE _ml/inference/<model_id>`


[[ml-delete-inference-prereq]]
==== {api-prereq-title}

* You must have the `machine_learning_admin` built-in role to use this API. For more
information, see <<security-privileges>> and <<built-in-roles>>.


[[ml-delete-inference-path-params]]
==== {api-path-parms-title}

`<model_id>`::
(Required, string)
include::{docdir}/ml/ml-shared.asciidoc[tag=model-id]


[[ml-delete-inference-example]]
==== {api-examples-title}

The following example deletes the `regression-job-one-1574775307356` trained
model:

[source,console]
--------------------------------------------------
DELETE _ml/inference/regression-job-one-1574775307356
--------------------------------------------------
// TEST[skip:TBD]

The API returns the following result:


[source,console-result]
----
{
"acknowledged" : true
}
----


134 changes: 134 additions & 0 deletions docs/reference/ml/df-analytics/apis/get-trained-model-stats.asciidoc
@@ -0,0 +1,134 @@
[role="xpack"]
[testenv="platinum"]
[[get-inference-stats]]
=== Get trained model statistics API
[subs="attributes"]
++++
<titleabbrev>Get trained model stats</titleabbrev>
++++

Retrieves usage information for trained {dfanalytics} models.

experimental[]


[[ml-get-inference-stats-request]]
==== {api-request-title}

`GET _ml/inference/_stats` +

`GET _ml/inference/_all/_stats` +

`GET _ml/inference/<model_id>/_stats` +

`GET _ml/inference/<model_id>,<model_id_2>/_stats` +

`GET _ml/inference/<model_id_pattern*>,<model_id_2>/_stats`


[[ml-get-inference-stats-prereq]]
==== {api-prereq-title}

* You must have the `monitor_ml` privilege to use this API. For more information,
see <<security-privileges>> and <<built-in-roles>>.


[[ml-get-inference-stats-desc]]
==== {api-description-title}

You can get usage information for multiple trained models in a single API
request by using a comma-separated list of model IDs or a wildcard expression.
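
For example, the following request (a sketch that reuses the model IDs from the
example response below) asks for statistics for one model by its exact ID and
for any model whose ID starts with `flight-delay`:

[source,console]
--------------------------------------------------
GET _ml/inference/regression-job-one-1574775307356,flight-delay*/_stats
--------------------------------------------------
// TEST[skip:TBD]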


[[ml-get-inference-stats-path-params]]
==== {api-path-parms-title}

`<model_id>`::
(Optional, string)
include::{docdir}/ml/ml-shared.asciidoc[tag=model-id]


[[ml-get-inference-stats-query-params]]
==== {api-query-parms-title}

`allow_no_match`::
(Optional, boolean)
include::{docdir}/ml/ml-shared.asciidoc[tag=allow-no-match]

`from`::
(Optional, integer)
include::{docdir}/ml/ml-shared.asciidoc[tag=from]

`size`::
(Optional, integer)
include::{docdir}/ml/ml-shared.asciidoc[tag=size]
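
The `from` and `size` parameters page through the results. For example, the
following request (a sketch) skips the first ten models and returns statistics
for the next ten:

[source,console]
--------------------------------------------------
GET _ml/inference/_all/_stats?from=10&size=10
--------------------------------------------------
// TEST[skip:TBD]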


[[ml-get-inference-stats-response-codes]]
==== {api-response-codes-title}

`404` (Missing resources)::
If `allow_no_match` is `false`, this code indicates that there are no
resources that match the request or only partial matches for the request.
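
For example, the following request (a sketch; `does-not-exist*` is a
hypothetical pattern that matches no models) returns a `404` because
`allow_no_match` is `false`:

[source,console]
--------------------------------------------------
GET _ml/inference/does-not-exist*/_stats?allow_no_match=false
--------------------------------------------------
// TEST[skip:TBD]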


[[ml-get-inference-stats-example]]
==== {api-examples-title}

The following example gets usage information for all the trained models:

[source,console]
--------------------------------------------------
GET _ml/inference/_stats
--------------------------------------------------
// TEST[skip:TBD]


The API returns the following results:

[source,console-result]
----
{
"count": 2,
"trained_model_stats": [
{
"model_id": "flight-delay-prediction-1574775339910",
"pipeline_count": 0
},
{
"model_id": "regression-job-one-1574775307356",
"pipeline_count": 1,
"ingest": {
"total": {
"count": 178,
"time_in_millis": 8,
"current": 0,
"failed": 0
},
"pipelines": {
"flight-delay": {
"count": 178,
"time_in_millis": 8,
"current": 0,
"failed": 0,
"processors": [
{
"inference": {
"type": "inference",
"stats": {
"count": 178,
"time_in_millis": 7,
"current": 0,
"failed": 0
}
}
}
]
}
}
}
}
]
}
----
// NOTCONSOLE
92 changes: 92 additions & 0 deletions docs/reference/ml/df-analytics/apis/get-trained-model.asciidoc
@@ -0,0 +1,92 @@
[role="xpack"]
[testenv="platinum"]
[[get-inference]]
=== Get trained model API
[subs="attributes"]
++++
<titleabbrev>Get trained model</titleabbrev>
++++

Retrieves configuration information for a trained {dfanalytics} model.

experimental[]


[[ml-get-inference-request]]
==== {api-request-title}

`GET _ml/inference/` +

`GET _ml/inference/<model_id>` +

`GET _ml/inference/_all` +

`GET _ml/inference/<model_id1>,<model_id2>` +

`GET _ml/inference/<model_id_pattern*>`


[[ml-get-inference-prereq]]
==== {api-prereq-title}

* You must have the `monitor_ml` privilege to use this API. For more information,
see <<security-privileges>> and <<built-in-roles>>.


[[ml-get-inference-desc]]
==== {api-description-title}

You can get information for multiple trained models in a single API request by
using a comma-separated list of model IDs or a wildcard expression.
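
As a sketch, the request below combines an exact model ID with a wildcard
pattern (both identifiers are borrowed from the examples in this documentation):

[source,console]
--------------------------------------------------
GET _ml/inference/regression-job-one-1574775307356,flight-delay*
--------------------------------------------------
// TEST[skip:TBD]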


[[ml-get-inference-path-params]]
==== {api-path-parms-title}

`<model_id>`::
(Optional, string)
include::{docdir}/ml/ml-shared.asciidoc[tag=model-id]


[[ml-get-inference-query-params]]
==== {api-query-parms-title}

`allow_no_match`::
(Optional, boolean)
include::{docdir}/ml/ml-shared.asciidoc[tag=allow-no-match]

`decompress_definition`::
(Optional, boolean)
include::{docdir}/ml/ml-shared.asciidoc[tag=decompress-definition]

`from`::
(Optional, integer)
include::{docdir}/ml/ml-shared.asciidoc[tag=from]

`include_model_definition`::
(Optional, boolean)
include::{docdir}/ml/ml-shared.asciidoc[tag=include-model-definition]

`size`::
(Optional, integer)
include::{docdir}/ml/ml-shared.asciidoc[tag=size]
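
For example, the following request (a sketch) retrieves a single model and
includes its definition, decompressed into a JSON map. Because
`include_model_definition` is `true`, the ID must match exactly one model:

[source,console]
--------------------------------------------------
GET _ml/inference/regression-job-one-1574775307356?include_model_definition=true&decompress_definition=true
--------------------------------------------------
// TEST[skip:TBD]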


[[ml-get-inference-response-codes]]
==== {api-response-codes-title}

`404` (Missing resources)::
If `allow_no_match` is `false`, this code indicates that there are no
resources that match the request or only partial matches for the request.


[[ml-get-inference-example]]
==== {api-examples-title}

The following example gets configuration information for all the trained models:

[source,console]
--------------------------------------------------
GET _ml/inference/
--------------------------------------------------
// TEST[skip:TBD]
7 changes: 7 additions & 0 deletions docs/reference/ml/df-analytics/apis/index.asciidoc
@@ -13,6 +13,9 @@ You can use the following APIs to perform {ml} {dfanalytics} activities.
* <<stop-dfanalytics,Stop {dfanalytics-jobs}>>
* <<evaluate-dfanalytics,Evaluate {dfanalytics}>>
* <<explain-dfanalytics,Explain {dfanalytics}>>
* <<get-inference>>
* <<get-inference-stats>>
* <<delete-inference>>

For the `analysis` object resources, check <<ml-dfa-analysis-objects>>.

@@ -32,3 +35,7 @@ include::get-dfanalytics-stats.asciidoc[]
//SET/START/STOP
include::start-dfanalytics.asciidoc[]
include::stop-dfanalytics.asciidoc[]
//INFERENCE
include::get-trained-model.asciidoc[]
include::get-trained-model-stats.asciidoc[]
include::delete-trained-model.asciidoc[]
15 changes: 15 additions & 0 deletions docs/reference/ml/ml-shared.asciidoc
@@ -483,6 +483,11 @@ Identifier for the {dfeed}. It can be a {dfeed} identifier or a wildcard
expression.
end::datafeed-id-wildcard[]

tag::decompress-definition[]
Specifies whether the included model definition should be returned as a JSON
map (`true`) or in a custom compressed format (`false`). Defaults to `true`.
end::decompress-definition[]

tag::delayed-data-check-config[]
Specifies whether the {dfeed} checks for missing data and the size of the
window. For example: `{"enabled": true, "check_window": "1h"}`.
@@ -688,6 +693,12 @@ tag::groups[]
A list of job groups. A job can belong to no groups or many.
end::groups[]

tag::include-model-definition[]
Specifies whether the model definition is returned in the response. Defaults
to `false`. If `true`, only a single model must match the ID patterns provided;
otherwise, a bad request is returned.
end::include-model-definition[]

tag::indices[]
An array of index names. Wildcards are supported. For example:
`["it_ops_metrics", "server*"]`.
@@ -828,6 +839,10 @@ recommended value.
--
end::mode[]

tag::model-id[]
The unique identifier of the trained {dfanalytics} model.
end::model-id[]

tag::model-memory-limit[]
The approximate maximum amount of memory resources that are required for
analytical processing. Once this limit is approached, data pruning becomes