
Timelion search ignores time range when choosing indices #10475

Closed

pjcard opened this issue Feb 21, 2017 · 19 comments
Labels
bug (Fixes for quality problems that affect the customer experience), Feature:Timelion (Timelion app and visualization), Feature:Visualizations (Generic visualization features, in case no more specific feature label is available)

Comments

@pjcard

pjcard commented Feb 21, 2017

Kibana version:
Version: 5.1.1 Build 14566, Commit SHA 85a6f4d

Elasticsearch version:
5.1.1

Browser version:
Chrome 55.0.2883.87

Browser OS version:
Windows 10

Description of the problem including expected versus actual behavior:
When using Timelion, all indices matching the expression are queried, irrespective of the time range setting.

This is an issue because:

  1. Indices no longer have to use a pattern to describe the time range they contain, meaning there might be no expression which can be used to correctly restrict the indices which are queried.
  2. In the case where (1) doesn't apply, the expression would need to be updated manually when changing the time range. This would preclude use of the timelion chart within a dashboard.

Steps to reproduce:

  1. Navigate to the default view of Timelion; it will have a time range of 15 minutes and an expression of .es( * ).
  2. Note that all indices matching the pattern (in this case, *) are in fact queried, rather than just those covering the specified time range (see the sketch below).
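One way to confirm this from outside Kibana is to run an equivalent query by hand and look at _shards.total in the response. This is only a rough sketch; the wildcard pattern, the @timestamp field name, and the localhost:9200 endpoint are assumptions for illustration:

    # Restrict documents to the last 15 minutes, but search every index matching "*".
    # _shards.total in the response counts every shard contacted, which here is the
    # whole cluster rather than just the shards holding the last 15 minutes of data.
    curl -s 'localhost:9200/*/_search?size=0' -H 'Content-Type: application/json' -d '
    {
      "query": {
        "range": { "@timestamp": { "gte": "now-15m", "lte": "now" } }
      }
    }'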

My assumption would be that Timelion is not using the field stats API described here:
#4342
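For comparison, this is roughly what a field stats request for time-based index pattern expansion looks like (ES 5.x only; the API was later removed). The logstash-* pattern and @timestamp field are assumptions for illustration:

    # Ask which indices matching logstash-* contain @timestamp values overlapping
    # the last 15 minutes; only those indices need to be hit by the follow-up search.
    curl -s 'localhost:9200/logstash-*/_field_stats?level=indices' \
      -H 'Content-Type: application/json' -d '
    {
      "fields": ["@timestamp"],
      "index_constraints": {
        "@timestamp": {
          "max_value": { "gte": "now-15m" },
          "min_value": { "lte": "now" }
        }
      }
    }'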

Errors in browser console (if relevant):
An attached screenshot shows an error produced by querying far more shards than should have been needed for the short, 15-minute interval.

Provide logs and/or server output (if relevant):
The logs for the above request; note the number of shards hit and the value of the date histogram's extended bounds.
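For context, the request body behind such a chart looks roughly like the following (field name, interval, and bounds are illustrative): the extended_bounds pin the histogram to the selected time range, yet the search still fans out to every index matching the pattern.

    {
      "size": 0,
      "query": {
        "range": { "@timestamp": { "gte": "now-15m", "lte": "now" } }
      },
      "aggs": {
        "series": {
          "date_histogram": {
            "field": "@timestamp",
            "interval": "30s",
            "min_doc_count": 0,
            "extended_bounds": { "min": "now-15m", "max": "now" }
          }
        }
      }
    }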

Please ignore the indexing strategy; it only serves to illustrate the more general issue.

index_search_slowlog.log.txt

This issue was previously reported here:
elastic/timelion#195

@thomasneirynck added the bug, Feature:Timelion, and Feature:Visualizations labels Feb 21, 2017
@pjcard
Author

pjcard commented Feb 22, 2017

Thanks @thomasneirynck, much appreciated.

@leemon9527

Any help there? I'm having the same issue.

@boverhof

Same issue. Can't use the tool at all.

@Misakiri

Any news? Same issue for me too

@rhoboat changed the title from "Timelion search ignores time range when choosing indicies" to "Timelion search ignores time range when choosing indices" May 1, 2017
@schmod

schmod commented Jul 14, 2017

Having a similar issue here. It's impossible for me to put timelion queries on a dashboard because it's incredibly easy to generate 1000s of search requests with some fairly simple (time-constrained) queries.

@stevehedrick

I've had to remove these from our dashboards as well. We're looking at a timeframe that should only be searching 24 shards, and yet timelion is hitting 1004 shards because it's searching our entire logging history.

It's sad because we really did like the graphs that it creates.

@amansehgal-git

+1

1 similar comment
@StephenGoodall

+1

@thomasneirynck removed their assignment Sep 5, 2017
@slamminFunkFace

So when will this be fixed?

@pjcard
Author

pjcard commented Nov 1, 2017

Unless I'm misunderstanding, the resolution of the similar issue I reported for Kibana (#14633) suggests that this should become less of an issue in ES 5.6.

@timroes
Contributor

timroes commented Apr 18, 2018

As @pjcard already posted, as of ES 5.6 we no longer do time-based index pattern expansion, since ES now performs the linked optimization internally. This issue is therefore no longer relevant: all visualizations simply query an index pattern and let ES take care of filtering out shards outside the time range.

I will close this, but please feel free to leave a comment if you still experience issues with Kibana 5.6 upwards. Thanks for your patience while waiting for ES to provide the proper solution for this issue.

@timroes closed this as completed Apr 18, 2018
@pjcard
Author

pjcard commented Apr 18, 2018

One thing that didn't occur to me when I made that comment: how does this relate to queue sizes? One specific issue we had with this is that we were getting failures due to the query hitting shards outside of the time range we specified. So, for instance, with one index per day and three shards per index, searching over a week's worth of data shouldn't hit any queue limits, but in practice we did hit them because of all the indices that were being queried needlessly.

We did up our queue size, against recommendation, because there appeared to be nothing else we could do:
https://www.elastic.co/blog/why-am-i-seeing-bulk-rejections-in-my-elasticsearch-cluster
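For reference, a quick way to watch the search queue depth and rejection counts while reproducing this (assuming the standard cat API on localhost:9200):

    # Per node: active search threads, queued shard searches, and cumulative
    # counts of rejected and completed search requests.
    curl -s 'localhost:9200/_cat/thread_pool/search?v&h=node_name,active,queue,rejected,completed'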

Any feedback you have on this @timroes would be greatly appreciated. If hitting these pointless indices is still filling up queues, then our queries are still being needlessly limited by data we're purposely trying to exclude, and this bug would still need addressing.

@pjcard
Author

pjcard commented Apr 18, 2018

To be more specific, see @robin13's suggestion here for avoiding shard fetch failures (#3221):
"Ensure the query hits less indices/shards by better time range filters?"
That was the original issue prompting this bug report; we were getting these failures despite trying to follow this advice.

@bleskes
Contributor

bleskes commented Apr 18, 2018

@pjcard this may take some more investigation. I suggest you open up a discussion on our forums so we can help out. We keep GitHub for issues and feature requests. Please make sure to use the Elasticsearch forum, as that is what your question is about.

@timroes
Contributor

timroes commented Apr 18, 2018

Please feel free to leave the link to the discuss post here for cross-reference, so other users might be able to read up on that topic.

@pjcard
Author

pjcard commented Apr 18, 2018

@bleskes I'm not sure what you mean: either it still causes shard failures, in which case the bug is still valid and should be reopened, or it does not, in which case there is nothing further to discuss.

Edit: I've just checked, and we're still on 5.5.3, so I can't help retest the scenario. It should be fairly trivial, though.

@clintongormley
Contributor

@pjcard The fast shard pre-filtering (see elastic/elasticsearch#25658) does not use the search queue and so is not subject to rejection. It also doesn't fill the queues, so it won't cause other searches to be rejected.

On top of that, elastic/elasticsearch#25632 limits the number of concurrent shard-level search requests that can be sent per search, so that a single search can't dominate the cluster.
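A rough sketch of how those two knobs appear on the _search endpoint (index pattern, field name, and values are illustrative, not recommendations):

    # pre_filter_shard_size: shard-count threshold above which the pre-filter
    #   round-trip is used to skip shards that cannot match the query.
    # max_concurrent_shard_requests: cap on shard-level requests in flight for
    #   this one search, so a single search cannot monopolise the cluster.
    curl -s 'localhost:9200/logstash-*/_search?size=0&pre_filter_shard_size=128&max_concurrent_shard_requests=5' \
      -H 'Content-Type: application/json' -d '
    {
      "query": {
        "range": { "@timestamp": { "gte": "now-7d", "lte": "now" } }
      }
    }'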

@pjcard
Author

pjcard commented Apr 18, 2018

@clintongormley Ah, thank you for the clarification, much appreciated. I will keep pushing my guys to upgrade then, and it sounds like @timroes was spot on in resolving it.

@bleskes
Contributor

bleskes commented Apr 19, 2018

For future readers - the Elasticsearch limit that causes the error mentioned in this ticket has been removed in 5.4.0 due to the changes @clintongormley mentioned: elastic/elasticsearch#24012
