Hello,
Currently the Elasticsearch output in Logstash does not allow the creation of a custom data stream type. If you use the `data_stream_*` settings of the output, it validates `data_stream_type` and only allows the following values:
logs
metrics
synthetics
traces
All of those types are also used by Elastic Agent and have system-managed templates and lifecycle policies. So to use data streams in Logstash today, you need to create a template for the type you want while making sure that it does not override the system templates. This makes things more complex, and there is always the risk of a human error that overrides the templates used by Elastic Agent and breaks things.
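For example, a dedicated index template can be registered for the custom name so that Elasticsearch auto-creates matching indices as data streams. The template name, pattern, and priority below are illustrative only, and the pattern must be chosen so it does not overlap the managed `logs-*-*`, `metrics-*-*`, etc. patterns:

```sh
# Register an index template whose pattern covers the custom data stream name.
# "data_stream": {} makes any matching index be auto-created as a data stream.
# Template name, pattern, and priority are illustrative.
curl -X PUT "http://localhost:9200/_index_template/myapp-template" \
  -H 'Content-Type: application/json' -d'
{
  "index_patterns": ["myapp*"],
  "data_stream": {},
  "priority": 200
}'
```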
To be able to use custom data streams in Logstash, you need a workaround in the output configuration like the example below:
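A minimal sketch of that kind of workaround, assuming a hypothetical data stream named `myapp` whose index template (as above) already declares it as a data stream; the hosts and names are illustrative:

```
output {
  elasticsearch {
    hosts  => ["https://localhost:9200"]
    # Skip the data_stream_* options entirely: write straight to the data
    # stream name and use the "create" action, which data streams require.
    index  => "myapp"
    action => "create"
  }
}
```

Because the `data_stream_*` settings are not used, the plugin performs no type validation, and Elasticsearch creates the target as a data stream thanks to the matching index template.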
1. Migrate the config option's validation to a regexp like `\A(?!\.{1,2}$)[[:lower:][:digit:]][[:lower:][:digit:]\._+]{0,252}\Z`, which successfully rejects known-invalid index prefixes while letting likely-valid ones through (limitation: the length of the composed index name cannot be validated from a single component alone); a rough check of this pattern is sketched after this list.
2. Add a validator to the Validator Support mixin that does the same as (1) more readably/efficiently.
3. Validate the composed index name for data streams in `#initialize` or `#register`, if and only if `data_stream` is effectively true.
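For a concrete feel of what that pattern accepts and rejects, here is a rough, hypothetical check in plain Ruby (not the plugin's actual validation code; the sample names are made up):

```ruby
# Hypothetical sketch: exercise the regexp proposed in (1) against a few names.
NAME_PATTERN = /\A(?!\.{1,2}$)[[:lower:][:digit:]][[:lower:][:digit:]\._+]{0,252}\Z/

# "myapp.prod" and "1metrics" match; "MyApp", "_hidden", and ".." do not.
["myapp.prod", "1metrics", "MyApp", "_hidden", ".."].each do |name|
  verdict = name.match?(NAME_PATTERN) ? "accepted" : "rejected"
  puts "#{name}: #{verdict}"
end
```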
Currently Logstash is only capable of creating data streams that follow the Elastic naming scheme, `<type>-<dataset>-<namespace>`, so with `data_stream` set to true, the user is limited to creating data streams using one of the available types and also needs to provide a dataset and a namespace.
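For comparison, a typical configuration under that scheme looks roughly like this (hosts and names are illustrative):

```
output {
  elasticsearch {
    hosts => ["https://localhost:9200"]
    # The resulting backing data stream is named logs-myapp-production.
    data_stream           => "true"
    data_stream_type      => "logs"       # only logs, metrics, synthetics, traces pass validation
    data_stream_dataset   => "myapp"
    data_stream_namespace => "production"
  }
}
```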
In my case my data streams follow another naming pattern: I do not use a type or a namespace; some data streams have a prefix in the name and others are just the dataset name.
So the configuration I shared works because Logstash just sends the request, and the index is created as a data stream because that is defined in the index template.
To be honest, just having these steps in the documentation could be enough.
While this workaround works, Logstash should allow the creation of data streams with custom types, which is not possible right now.