
[DOCS] Add data streams to bulk, delete, and index API docs #58340


Merged: jrodewig merged 7 commits into elastic:master on Jun 23, 2020

Conversation

jrodewig (Contributor):

Updates existing docs for the bulk, delete and index APIs to make them
aware of data streams.
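
For context, a minimal sketch of the requests these docs now cover, using a hypothetical data stream name (my-data-stream): documents are added to a data stream with the create op_type, and bulk requests against a data stream likewise only accept create actions.

POST /my-data-stream/_doc/
{
  "@timestamp": "2020-06-18T17:04:00.000Z",
  "message": "example event"
}

POST /my-data-stream/_bulk
{ "create": { } }
{ "@timestamp": "2020-06-18T17:05:00.000Z", "message": "another example event" }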

jrodewig mentioned this pull request on Jun 18, 2020
jrodewig added the :Data Management/Data streams (Data streams and their lifecycles), >docs (General docs changes), v7.9.0, and v8.0.0 labels on Jun 18, 2020
jrodewig requested review from danhermann and martijnvg on June 18, 2020 at 17:04
jrodewig marked this pull request as ready for review on June 18, 2020 at 17:04
danhermann (Contributor) left a comment:

One minor and one more significant comment below:

Comment on lines 330 to 331
Name of the index associated with the operation. If the operation targeted a
data stream, this is the associated backing index.
danhermann (Contributor):

Minor: I wonder if it would be more clear to say something along the lines of "the backing index into which the document was written" instead of "associated backing index" since a data stream may have many associated backing indices.

jrodewig (Contributor, author):

Good point. I used this wording here and a modified version for failed operations.
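
As a sketch of what that wording describes (the stream, backing index, and document ID below are hypothetical), a bulk item response for a write to a data stream reports the backing index the document was written into in the _index field:

{
  "create": {
    "_index": ".ds-my-data-stream-000001",
    "_id": "bfspvnIBr7VVZlfp2lqX",
    "_version": 1,
    "result": "created",
    "status": 201
  }
}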

@@ -4,18 +4,19 @@
 <titleabbrev>Delete</titleabbrev>
 ++++

-Removes a JSON document from the specified index.
+Removes a JSON document from a specified data stream or index.
danhermann (Contributor):

We disallow deletes directly on a data stream because it would require searching every backing index for the specified ID. If a document is to be deleted from a data stream, the relevant backing index must be targeted directly.

jrodewig (Contributor, author):

:doh: Thanks for catching this. I think I got overzealous.

Fixed up here: https://github.com/elastic/elasticsearch/pull/58340/files#diff-907ef82eb2dfb22151e4312359f5aee4
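
To illustrate the constraint (names hypothetical): a document in a data stream is deleted by targeting the backing index that holds it, usually found via a search, because a delete request against the stream itself is rejected.

GET /my-data-stream/_search?q=message:example
(the matching hit's _index field reports the backing index, e.g. .ds-my-data-stream-000001)

DELETE /.ds-my-data-stream-000001/_doc/bfspvnIBr7VVZlfp2lqX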

jrodewig requested a review from danhermann on June 18, 2020 at 19:45
danhermann (Contributor) left a comment:

LGTM. Thanks, @jrodewig!

jrodewig merged commit 48f4a8d into elastic:master on Jun 23, 2020
jrodewig deleted the docs__make-index-delete-ds-aware branch on June 23, 2020
jrodewig added a commit that referenced this pull request Jun 23, 2020
…58434)

Updates existing docs for the bulk, delete and index APIs to make them
aware of data streams.
jrodewig (Contributor, author):

Backport commits

master 48f4a8d
7.x afbf3bd

Labels: :Data Management/Data streams (Data streams and their lifecycles), >docs (General docs changes), v7.9.0, v8.0.0-alpha1
3 participants