
[TSDB] Removed summary and histogram metric types #89937

Merged — 4 commits, Sep 9, 2022
26 changes: 6 additions & 20 deletions docs/reference/data-streams/tsds.asciidoc
Original file line number Diff line number Diff line change
@@ -89,7 +89,7 @@ A TSDS document is uniquely identified by its time series and timestamp, both of
which are used to generate the document `_id`. So, two documents with the same
dimensions and the same timestamp are considered to be duplicates. When you use
the `_bulk` endpoint to add documents to a TSDS, a second document with the same
timestamp and dimensions overwrites the first. When you use the
`PUT /<target>/_create/<_id>` format to add an individual document and a document
with the same `_id` already exists, an error is generated.
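The duplicate-handling rule above can be sketched as follows. The hashing scheme here is purely illustrative (Elasticsearch's actual `_id` encoding is internal and differs), but it shows why two documents sharing the same dimensions and timestamp collide:

```python
import hashlib
import json

def derive_id(dimensions: dict, timestamp: str) -> str:
    # Hypothetical sketch: hash the sorted dimension key/value pairs
    # together with the timestamp, mimicking how a TSDS derives _id
    # from the time series plus @timestamp. The real encoding in
    # Elasticsearch differs; this only illustrates the collision rule.
    payload = json.dumps({"dims": dimensions, "ts": timestamp}, sort_keys=True)
    return hashlib.sha1(payload.encode()).hexdigest()[:20]

doc_a = derive_id({"host": "web-01", "region": "us-east"}, "2022-09-09T00:00:00Z")
doc_b = derive_id({"host": "web-01", "region": "us-east"}, "2022-09-09T00:00:00Z")
doc_c = derive_id({"host": "web-02", "region": "us-east"}, "2022-09-09T00:00:00Z")

assert doc_a == doc_b  # same dimensions + timestamp: duplicate, _bulk overwrites
assert doc_a != doc_c  # different dimensions: distinct time series, distinct _id
```

Under this rule, `_bulk` silently replaces the earlier duplicate, while `PUT /<target>/_create/<_id>` fails because `_create` refuses to overwrite an existing `_id`.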

@@ -105,16 +105,16 @@ parameter:
* <<number,`long`>>
* <<number,`unsigned_long`>>

[[dimension-limits]]
.Dimension limits
****
In a TSDS, {es} uses dimensions to
generate the document `_id` and <<tsid,`_tsid`>> values. The resulting `_id` is
always a short encoded hash. To prevent the `_tsid` value from being overly
large, {es} limits the number of dimensions for an index using the
<<index-mapping-dimension-fields-limit,`index.mapping.dimension_fields.limit`>>
index setting. While you can increase this limit, the resulting document `_tsid`
value can't exceed 32KB.
****
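A rough sketch of the size concern described in the sidebar above. The flat key=value encoding below is hypothetical (the real `_tsid` encoding is internal to Elasticsearch), but it illustrates how dimension count and value length push the encoded `_tsid` toward the 32KB cap:

```python
def encode_tsid(dimensions: dict) -> bytes:
    # Hypothetical flat encoding of dimension names and values;
    # the real _tsid encoding in Elasticsearch differs.
    return ";".join(f"{k}={v}" for k, v in sorted(dimensions.items())).encode("utf-8")

TSID_LIMIT = 32 * 1024  # a document's _tsid value can't exceed 32KB

tsid = encode_tsid({"host": "web-01", "region": "us-east", "pod": "pod-1234"})
assert len(tsid) < TSID_LIMIT  # within the limit for a small dimension set
```

This is why raising `index.mapping.dimension_fields.limit` only helps up to a point: more dimension fields mean a larger encoded `_tsid`, and the 32KB ceiling still applies.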

[discrete]
@@ -157,20 +157,6 @@ available disk space.
Only numeric and `aggregate_metric_double` fields support the `gauge` metric
type.

// tag::time-series-metric-histogram[]
`histogram`:: A pair of numeric arrays that measure the distribution of values
across predefined buckets. For example, server response times by percentile.
// end::time-series-metric-histogram[]
+
Only `histogram` fields support the `histogram` metric type.

// tag::time-series-metric-summary[]
`summary`:: An array of aggregated values, such as `sum`, `avg`, `value_count`,
`min`, and `max`.
// end::time-series-metric-summary[]
+
Only `aggregate_metric_double` fields support the `summary` metric type.

// tag::time-series-metric-null[]
`null` (Default):: Not a time series metric.
// end::time-series-metric-null[]
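With the `histogram` and `summary` entries removed above, a TSDS mapping can only label fields with the remaining metric types. A minimal sketch (field names are invented for illustration):

```python
# Minimal sketch of a TSDS field mapping after this PR: `summary` and
# `histogram` are no longer accepted values for time_series_metric.
mapping = {
    "properties": {
        "host": {"type": "keyword", "time_series_dimension": True},
        "requests_total": {"type": "long", "time_series_metric": "counter"},
        "cpu_usage": {"type": "double", "time_series_metric": "gauge"},
    }
}

valid_metric_types = {"counter", "gauge"}  # plus null (no metric) as the default
for name, field in mapping["properties"].items():
    metric = field.get("time_series_metric")
    assert metric is None or metric in valid_metric_types
```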
@@ -303,4 +289,4 @@ Now that you know the basics, you're ready to <<set-up-tsds,create a TSDS>> or
<<set-up-tsds,convert an existing data stream to a TSDS>>.

include::set-up-tsds.asciidoc[]
include::tsds-index-settings.asciidoc[]
3 changes: 0 additions & 3 deletions docs/reference/mapping/types/aggregate-metric-double.asciidoc
@@ -64,12 +64,9 @@ include::numeric.asciidoc[tag=time_series_metric]
.Valid `time_series_metric` values for `aggregate_metric_double` fields
[%collapsible%open]
====
include::{es-repo-dir}/data-streams/tsds.asciidoc[tag=time-series-metric-counter]

include::{es-repo-dir}/data-streams/tsds.asciidoc[tag=time-series-metric-gauge]

include::{es-repo-dir}/data-streams/tsds.asciidoc[tag=time-series-metric-summary]

include::{es-repo-dir}/data-streams/tsds.asciidoc[tag=time-series-metric-null]
====

17 changes: 0 additions & 17 deletions docs/reference/mapping/types/histogram.asciidoc
@@ -24,23 +24,6 @@ per document. Nested arrays are not supported.
========

[role="child_attributes"]
[[histogram-params]]
==== Parameters

ifeval::["{release-state}"=="unreleased"]
`time_series_metric`::
preview:[] (Optional, string)
include::numeric.asciidoc[tag=time_series_metric]
+
.Valid `time_series_metric` values for `histogram` fields
[%collapsible%open]
====
include::{es-repo-dir}/data-streams/tsds.asciidoc[tag=time-series-metric-histogram]

include::{es-repo-dir}/data-streams/tsds.asciidoc[tag=time-series-metric-null]
====
endif::[]

[[histogram-uses]]
==== Uses

@@ -24,9 +24,7 @@ private TimeSeriesParams() {}

public enum MetricType {
gauge(new String[] { "max", "min", "value_count", "sum" }),
counter(new String[] { "last_value" }),
histogram(new String[] { "value_count" }), // TODO Add more aggs
summary(new String[] { "value_count", "sum", "min", "max" });
counter(new String[] { "last_value" });

private final String[] supportedAggs;

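The trimmed enum above maps each remaining metric type to its supported default aggregations; a quick sketch of the resulting table (in Python for brevity, mirroring the Java diff):

```python
# Per-metric supported aggregations after this PR; mirrors the trimmed
# MetricType enum (the histogram and summary entries are removed).
SUPPORTED_AGGS = {
    "gauge": ["max", "min", "value_count", "sum"],
    "counter": ["last_value"],
}

assert set(SUPPORTED_AGGS) == {"gauge", "counter"}
assert "histogram" not in SUPPORTED_AGGS
assert "summary" not in SUPPORTED_AGGS
```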
@@ -35,7 +35,6 @@
import org.elasticsearch.index.mapper.SourceLoader;
import org.elasticsearch.index.mapper.SourceValueFetcher;
import org.elasticsearch.index.mapper.TextSearchInfo;
import org.elasticsearch.index.mapper.TimeSeriesParams;
import org.elasticsearch.index.mapper.ValueFetcher;
import org.elasticsearch.index.query.SearchExecutionContext;
import org.elasticsearch.script.field.DocValuesScriptFieldFactory;
@@ -74,12 +73,6 @@ public static class Builder extends FieldMapper.Builder {
private final Parameter<Map<String, String>> meta = Parameter.metaParam();
private final Parameter<Explicit<Boolean>> ignoreMalformed;

/**
* Parameter that marks this field as a time series metric defining its time series metric type.
* For {@link HistogramFieldMapper} fields only the histogram metric type is supported.
*/
private final Parameter<TimeSeriesParams.MetricType> metric;

public Builder(String name, boolean ignoreMalformedByDefault) {
super(name);
this.ignoreMalformed = Parameter.explicitBoolParam(
@@ -88,20 +81,18 @@ public Builder(String name, boolean ignoreMalformedByDefault) {
m -> toType(m).ignoreMalformed,
ignoreMalformedByDefault
);

this.metric = TimeSeriesParams.metricParam(m -> toType(m).metricType, TimeSeriesParams.MetricType.histogram);
}

@Override
protected Parameter<?>[] getParameters() {
return new Parameter<?>[] { ignoreMalformed, meta, metric };
return new Parameter<?>[] { ignoreMalformed, meta };
}

@Override
public HistogramFieldMapper build(MapperBuilderContext context) {
return new HistogramFieldMapper(
name,
new HistogramFieldType(context.buildFullName(name), meta.getValue(), metric.getValue()),
new HistogramFieldType(context.buildFullName(name), meta.getValue()),
multiFieldsBuilder.build(this, context),
copyTo.build(),
this
@@ -117,9 +108,6 @@ public HistogramFieldMapper build(MapperBuilderContext context) {
private final Explicit<Boolean> ignoreMalformed;
private final boolean ignoreMalformedByDefault;

/** The metric type (gauge, counter, summary) if field is a time series metric */
private final TimeSeriesParams.MetricType metricType;

public HistogramFieldMapper(
String simpleName,
MappedFieldType mappedFieldType,
@@ -130,7 +118,6 @@ public HistogramFieldMapper(
super(simpleName, mappedFieldType, multiFields, copyTo);
this.ignoreMalformed = builder.ignoreMalformed.getValue();
this.ignoreMalformedByDefault = builder.ignoreMalformed.getDefaultValue().value();
this.metricType = builder.metric.getValue();
}

boolean ignoreMalformed() {
@@ -154,11 +141,8 @@ protected void parseCreateField(DocumentParserContext context) {

public static class HistogramFieldType extends MappedFieldType {

private final TimeSeriesParams.MetricType metricType;

public HistogramFieldType(String name, Map<String, String> meta, TimeSeriesParams.MetricType metricType) {
public HistogramFieldType(String name, Map<String, String> meta) {
super(name, false, false, true, TextSearchInfo.NONE, meta);
this.metricType = metricType;
}

@Override
@@ -262,14 +246,6 @@ public Query termQuery(Object value, SearchExecutionContext context) {
"[" + CONTENT_TYPE + "] field do not support searching, " + "use dedicated aggregations instead: [" + name() + "]"
);
}

/**
* If field is a time series metric field, returns its metric type
* @return the metric type or null
*/
public TimeSeriesParams.MetricType getMetricType() {
return metricType;
}
}

@Override
@@ -235,7 +235,7 @@ protected AggregationBuilder createAggBuilderForTypeTest(MappedFieldType fieldTy
}

private MappedFieldType defaultFieldType(String fieldName) {
return new HistogramFieldMapper.HistogramFieldType(fieldName, Collections.emptyMap(), null);
return new HistogramFieldMapper.HistogramFieldType(fieldName, Collections.emptyMap());
}

}
@@ -373,7 +373,7 @@ protected AggregationBuilder createAggBuilderForTypeTest(MappedFieldType fieldTy

private MappedFieldType defaultFieldType(String fieldName) {
if (fieldName.equals(HISTO_FIELD_NAME)) {
return new HistogramFieldMapper.HistogramFieldType(fieldName, Collections.emptyMap(), null);
return new HistogramFieldMapper.HistogramFieldType(fieldName, Collections.emptyMap());
} else {
return new NumberFieldMapper.NumberFieldType(fieldName, NumberFieldMapper.NumberType.DOUBLE);
}
@@ -90,7 +90,7 @@ public void testSimple() throws IOException {
PercentileRanksAggregationBuilder aggBuilder = new PercentileRanksAggregationBuilder("my_agg", new double[] { 0.1, 0.5, 12 })
.field("field")
.method(PercentilesMethod.HDR);
MappedFieldType fieldType = new HistogramFieldMapper.HistogramFieldType("field", Collections.emptyMap(), null);
MappedFieldType fieldType = new HistogramFieldMapper.HistogramFieldType("field", Collections.emptyMap());
try (IndexReader reader = w.getReader()) {
IndexSearcher searcher = new IndexSearcher(reader);
PercentileRanks ranks = searchAndReduce(searcher, new MatchAllDocsQuery(), aggBuilder, fieldType);
@@ -149,7 +149,7 @@ private void testCase(Query query, CheckedConsumer<RandomIndexWriter, IOExceptio
PercentilesAggregationBuilder builder = new PercentilesAggregationBuilder("test").field("number")
.method(PercentilesMethod.HDR);

MappedFieldType fieldType = new HistogramFieldMapper.HistogramFieldType("number", Collections.emptyMap(), null);
MappedFieldType fieldType = new HistogramFieldMapper.HistogramFieldType("number", Collections.emptyMap());
Aggregator aggregator = createAggregator(builder, indexSearcher, fieldType);
aggregator.preCollection();
indexSearcher.search(query, aggregator.asCollector());
@@ -136,6 +136,6 @@ protected AggregationBuilder createAggBuilderForTypeTest(MappedFieldType fieldTy
}

private MappedFieldType defaultFieldType() {
return new HistogramFieldMapper.HistogramFieldType(HistoBackedAvgAggregatorTests.FIELD_NAME, Collections.emptyMap(), null);
return new HistogramFieldMapper.HistogramFieldType(HistoBackedAvgAggregatorTests.FIELD_NAME, Collections.emptyMap());
}
}
@@ -135,6 +135,6 @@ protected AggregationBuilder createAggBuilderForTypeTest(MappedFieldType fieldTy
}

private MappedFieldType defaultFieldType() {
return new HistogramFieldMapper.HistogramFieldType(HistoBackedMaxAggregatorTests.FIELD_NAME, Collections.emptyMap(), null);
return new HistogramFieldMapper.HistogramFieldType(HistoBackedMaxAggregatorTests.FIELD_NAME, Collections.emptyMap());
}
}
@@ -135,6 +135,6 @@ protected AggregationBuilder createAggBuilderForTypeTest(MappedFieldType fieldTy
}

private MappedFieldType defaultFieldType() {
return new HistogramFieldMapper.HistogramFieldType(HistoBackedMinAggregatorTests.FIELD_NAME, Collections.emptyMap(), null);
return new HistogramFieldMapper.HistogramFieldType(HistoBackedMinAggregatorTests.FIELD_NAME, Collections.emptyMap());
}
}
@@ -135,6 +135,6 @@ protected AggregationBuilder createAggBuilderForTypeTest(MappedFieldType fieldTy
}

private MappedFieldType defaultFieldType() {
return new HistogramFieldMapper.HistogramFieldType(HistoBackedSumAggregatorTests.FIELD_NAME, Collections.emptyMap(), null);
return new HistogramFieldMapper.HistogramFieldType(HistoBackedSumAggregatorTests.FIELD_NAME, Collections.emptyMap());
}
}
@@ -140,6 +140,6 @@ protected AggregationBuilder createAggBuilderForTypeTest(MappedFieldType fieldTy
}

private MappedFieldType defaultFieldType() {
return new HistogramFieldMapper.HistogramFieldType("field", Collections.emptyMap(), null);
return new HistogramFieldMapper.HistogramFieldType("field", Collections.emptyMap());
}
}
@@ -70,7 +70,7 @@ public void testSimple() throws IOException {
PercentileRanksAggregationBuilder aggBuilder = new PercentileRanksAggregationBuilder("my_agg", new double[] { 0.1, 0.5, 12 })
.field("field")
.method(PercentilesMethod.TDIGEST);
MappedFieldType fieldType = new HistogramFieldMapper.HistogramFieldType("field", Collections.emptyMap(), null);
MappedFieldType fieldType = new HistogramFieldMapper.HistogramFieldType("field", Collections.emptyMap());
try (IndexReader reader = w.getReader()) {
IndexSearcher searcher = new IndexSearcher(reader);
PercentileRanks ranks = searchAndReduce(searcher, new MatchAllDocsQuery(), aggBuilder, fieldType);
@@ -130,7 +130,7 @@ private void testCase(
PercentilesAggregationBuilder builder = new PercentilesAggregationBuilder("test").field("number")
.method(PercentilesMethod.TDIGEST);

MappedFieldType fieldType = new HistogramFieldMapper.HistogramFieldType("number", Collections.emptyMap(), null);
MappedFieldType fieldType = new HistogramFieldMapper.HistogramFieldType("number", Collections.emptyMap());
Aggregator aggregator = createAggregator(builder, indexSearcher, fieldType);
aggregator.preCollection();
indexSearcher.search(query, aggregator.asCollector());
@@ -9,7 +9,6 @@
import org.elasticsearch.index.mapper.DocumentMapper;
import org.elasticsearch.index.mapper.MappedFieldType;
import org.elasticsearch.index.mapper.MapperParsingException;
import org.elasticsearch.index.mapper.MapperService;
import org.elasticsearch.index.mapper.MapperTestCase;
import org.elasticsearch.index.mapper.ParsedDocument;
import org.elasticsearch.index.mapper.SourceToParse;
@@ -318,37 +317,6 @@ public void testCannotBeUsedInMultifields() {
assertThat(e.getMessage(), containsString("Field [hist] of type [histogram] can't be used in multifields"));
}

public void testMetricType() throws IOException {
// Test default setting
MapperService mapperService = createMapperService(fieldMapping(b -> minimalMapping(b)));
HistogramFieldMapper.HistogramFieldType ft = (HistogramFieldMapper.HistogramFieldType) mapperService.fieldType("field");
assertNull(ft.getMetricType());
assertMetricType("histogram", HistogramFieldMapper.HistogramFieldType::getMetricType);

{
// Test invalid metric type for this field type
Exception e = expectThrows(MapperParsingException.class, () -> createMapperService(fieldMapping(b -> {
minimalMapping(b);
b.field("time_series_metric", "gauge");
})));
assertThat(
e.getCause().getMessage(),
containsString("Unknown value [gauge] for field [time_series_metric] - accepted values are [histogram]")
);
}
{
// Test invalid metric type for this field type
Exception e = expectThrows(MapperParsingException.class, () -> createMapperService(fieldMapping(b -> {
minimalMapping(b);
b.field("time_series_metric", "unknown");
})));
assertThat(
e.getCause().getMessage(),
containsString("Unknown value [unknown] for field [time_series_metric] - accepted values are [histogram]")
);
}
}

@Override
protected IngestScriptSupport ingestScriptSupport() {
throw new AssumptionViolatedException("not supported");
@@ -719,7 +719,7 @@ public void testFormatter() throws IOException {
}

public void testHistogramFieldMonthToMonth() throws IOException {
MappedFieldType histType = new HistogramFieldMapper.HistogramFieldType("val", Collections.emptyMap(), null);
MappedFieldType histType = new HistogramFieldMapper.HistogramFieldType("val", Collections.emptyMap());
MappedFieldType dateType = dateFieldType(DATE_FIELD);
RateAggregationBuilder rateAggregationBuilder = new RateAggregationBuilder("my_rate").rateUnit("month").field("val");
if (randomBoolean()) {
Expand All @@ -742,7 +742,7 @@ public void testHistogramFieldMonthToMonth() throws IOException {
}

public void testHistogramFieldMonthToYear() throws IOException {
MappedFieldType histType = new HistogramFieldMapper.HistogramFieldType("val", Collections.emptyMap(), null);
MappedFieldType histType = new HistogramFieldMapper.HistogramFieldType("val", Collections.emptyMap());
MappedFieldType dateType = dateFieldType(DATE_FIELD);
RateAggregationBuilder rateAggregationBuilder = new RateAggregationBuilder("my_rate").rateUnit("month").field("val");
if (randomBoolean()) {
@@ -762,7 +762,7 @@ public void testHistogramFieldMonthToYear() throws IOException {
}

public void testHistogramFieldMonthToMonthValueCount() throws IOException {
MappedFieldType histType = new HistogramFieldMapper.HistogramFieldType("val", Collections.emptyMap(), null);
MappedFieldType histType = new HistogramFieldMapper.HistogramFieldType("val", Collections.emptyMap());
MappedFieldType dateType = dateFieldType(DATE_FIELD);
RateAggregationBuilder rateAggregationBuilder = new RateAggregationBuilder("my_rate").rateUnit("month")
.rateMode("value_count")
@@ -784,7 +784,7 @@ public void testHistogramFieldMonthToMonthValueCount() throws IOException {
}

public void testHistogramFieldMonthToYearValueCount() throws IOException {
MappedFieldType histType = new HistogramFieldMapper.HistogramFieldType("val", Collections.emptyMap(), null);
MappedFieldType histType = new HistogramFieldMapper.HistogramFieldType("val", Collections.emptyMap());
MappedFieldType dateType = dateFieldType(DATE_FIELD);
RateAggregationBuilder rateAggregationBuilder = new RateAggregationBuilder("my_rate").rateUnit("month")
.rateMode("value_count")
@@ -805,7 +805,7 @@ public void testHistogramFieldMonthToYearValueCount() throws IOException {
}

public void testFilterWithHistogramField() throws IOException {
MappedFieldType histType = new HistogramFieldMapper.HistogramFieldType("val", Collections.emptyMap(), null);
MappedFieldType histType = new HistogramFieldMapper.HistogramFieldType("val", Collections.emptyMap());
MappedFieldType dateType = dateFieldType(DATE_FIELD);
MappedFieldType keywordType = new KeywordFieldMapper.KeywordFieldType("term");
RateAggregationBuilder rateAggregationBuilder = new RateAggregationBuilder("my_rate").rateUnit("month").field("val");
4 changes: 4 additions & 0 deletions x-pack/plugin/build.gradle
@@ -79,6 +79,10 @@ tasks.named("yamlRestTestV7CompatTest").configure {
'unsigned_long/50_script_values/script_score query',
'unsigned_long/50_script_values/Script query',
'data_stream/140_data_stream_aliases/Fix IndexNotFoundException error when handling remove alias action',
'aggregate-metrics/90_tsdb_mappings/aggregate_double_metric with time series mappings',
'aggregate-metrics/90_tsdb_mappings/aggregate_double_metric with wrong time series mappings',
'analytics/histogram/histogram with wrong time series mappings',
'analytics/histogram/histogram with time series mappings',
].join(',')
}
