Using doc_count from one aggregation in another aggregation to calculate an additional percentage #8211
Comments
Hi @rajeshetty. Please ask questions like these on the mailing list, rather than in GitHub issues. That said, today you would have to do this calculation application-side. However, once #8110 is merged, you should be able to do things like this.
When is #8110 going into a main release? Which release would that be?
@rajeshetty It is scheduled for Elasticsearch 2.0 (see the tags on the issue), which I don't see being released for at least several months.
I appreciate your quick response. Which mailing list do you recommend for these kinds of questions — Stack Overflow or somewhere else? Your answer about the new reducers was very helpful: even though I can't use them until 2.0 is out, it is still useful to know that ES is working on this. So which forum would give me such a quick turnaround on questions like these?
Let's say I have a doc like this:
Here, location_verification is an array of all the friends who have verified the location, and it will always be equal to or smaller than the location_shared_with array.
Mapping:
So I need to calculate the % verified per location array. I have created the right "nested" mappings between the main doc (user) and location, and between location and location_verification. I have a terms aggregation that tells me the counts.
When I execute:
I get the following:
What I want to do is calculate the % verified per location.
Formula: (location_verification_count -> doc_count / location_shared_with_count -> doc_count) x 100
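As suggested in the comments above, before the reducer framework (#8110) lands this has to be computed application-side. Below is a minimal sketch of that approach in Python: it walks the buckets of a terms aggregation response and applies the formula above to each bucket. The response shape, bucket keys, and sub-aggregation names (`locations`, `location_shared_with_count`, `location_verification_count`) are assumptions modeled on the field names described in this issue, not the actual response from the elided query.

```python
# Hypothetical aggregation response fragment; the structure and the
# sub-aggregation names are assumptions based on the issue text.
response = {
    "aggregations": {
        "locations": {
            "buckets": [
                {
                    "key": "location_a",
                    "location_shared_with_count": {"doc_count": 8},
                    "location_verification_count": {"doc_count": 6},
                },
                {
                    "key": "location_b",
                    "location_shared_with_count": {"doc_count": 4},
                    "location_verification_count": {"doc_count": 1},
                },
            ]
        }
    }
}


def percent_verified(bucket):
    """Apply (verified_doc_count / shared_doc_count) x 100 to one bucket."""
    shared = bucket["location_shared_with_count"]["doc_count"]
    verified = bucket["location_verification_count"]["doc_count"]
    # Guard against division by zero for locations shared with no one.
    return (verified / shared) * 100 if shared else 0.0


for bucket in response["aggregations"]["locations"]["buckets"]:
    print(bucket["key"], percent_verified(bucket))
```

With the sample numbers above, location_a would come out at 75% verified and location_b at 25%. Once the reducer/pipeline aggregations from #8110 are available, the same division could be expressed inside the aggregation tree instead of in client code.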
The question is: