
[Docs] Consolidate single example into a single line #48904


Merged: 1 commit merged on Nov 8, 2019

Conversation

@orangejulius (Contributor) commented on Nov 7, 2019

The first example of splitting rules for the `word_delimiter` token filter was spread across two bullet points. This makes it look like they are two separate splitting rules.
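As context for the docs fix, the rule in question is the first splitting rule of the `word_delimiter` token filter: split tokens at non-alphanumeric characters, using those characters as delimiters (e.g. `Super-Duper` → `Super`, `Duper`). Below is a minimal Python sketch of that single rule for illustration only; it is not Elasticsearch's implementation, and the function name is hypothetical.

```python
import re

def split_on_non_alphanum(token):
    """Illustrative approximation of the word_delimiter filter's first
    splitting rule: split a token wherever a run of non-alphanumeric
    characters occurs, discarding the delimiters themselves."""
    parts = re.split(r"[^a-zA-Z0-9]+", token)
    return [p for p in parts if p]  # drop empty strings at the edges

print(split_on_non_alphanum("Super-Duper"))  # ['Super', 'Duper']
```

The point of the PR is simply that the example (`Super-Duper` → `Super`, `Duper`) belongs on the same bullet as the rule it illustrates, not on a bullet of its own.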
@orangejulius changed the title from "Consolidate single example into a single line" to "Docs: Consolidate single example into a single line" on Nov 7, 2019
@orangejulius changed the title from "Docs: Consolidate single example into a single line" to "[Docs] Consolidate single example into a single line" on Nov 7, 2019
@cbuescher (Member) left a comment


Hi @orangejulius, makes sense. Let me run our tests before merging this.

@cbuescher (Member)

@elasticmachine test this please

@cbuescher cbuescher merged commit ecde671 into elastic:7.4 Nov 8, 2019
cbuescher pushed a commit that referenced this pull request Nov 8, 2019
The first example of splitting rules for the `word_delimiter` token filter was spread across two bullet points. This makes it look like they are two separate splitting rules.
@elasticmachine (Collaborator)

Pinging @elastic/es-docs (>docs)

@cbuescher added the ":Search Relevance/Analysis" (How text is split into tokens) label on Nov 8, 2019
@elasticmachine (Collaborator)

Pinging @elastic/es-search (:Search/Analysis)

@orangejulius deleted the patch-1 branch on June 3, 2021 at 17:46
5 participants