query: Some Thanos Queries hang forever #919

Closed · povilasv opened this issue Mar 13, 2019 · 5 comments
povilasv (Member) commented Mar 13, 2019

Thanos, Prometheus and Golang version used

      containers:
      - name: thanos-query
        image: improbable/thanos:master-2019-03-12-910d438
        args:
        - query
        - --log.level=debug
        - --query.replica-label=replica
        - --store.sd-files=/etc/thanos/store-sd.yaml
        - --query.partial-response
        - --query.auto-downsampling
        - --cluster.disable
        - --query.timeout=10s

What happened

For some queries, Thanos Query simply doesn't respond and leaves the connection hanging.

k port-forward thanos-query-849c9bc7fc-qtdv6 10902
level=info ts=2019-03-13T13:49:07.512593252Z caller=flags.go:87 msg="gossip is disabled"
level=info ts=2019-03-13T13:49:07.51565412Z caller=main.go:257 component=query msg="disabled TLS, key and cert must be set to enable"
level=info ts=2019-03-13T13:49:07.515694976Z caller=query.go:468 msg="starting query node"
level=info ts=2019-03-13T13:49:07.51582857Z caller=query.go:437 msg="Listening for query and metrics" address=0.0.0.0:10902
level=info ts=2019-03-13T13:49:07.515834613Z caller=query.go:460 component=query msg="Listening for StoreAPI gRPC" address=0.0.0.0:10901
level=info ts=2019-03-13T13:49:37.527189874Z caller=storeset.go:250 component=storeset msg="adding new store to query storeset" address=prometheus-0.thanos-sidecar.sys-mon:10901
level=info ts=2019-03-13T13:49:37.527236509Z caller=storeset.go:250 component=storeset msg="adding new store to query storeset" address=thanos-store.customer-platform:10901
level=info ts=2019-03-13T13:49:37.527253173Z caller=storeset.go:250 component=storeset msg="adding new store to query storeset" address=prometheus-0.thanos-sidecar.customer-platform:10901
level=info ts=2019-03-13T13:49:37.527263894Z caller=storeset.go:250 component=storeset msg="adding new store to query storeset" address=thanos-store.corp-mon:10901
level=info ts=2019-03-13T13:49:37.527273766Z caller=storeset.go:250 component=storeset msg="adding new store to query storeset" address=thanos-store.sys-mon:10901
level=info ts=2019-03-13T13:49:37.527285924Z caller=storeset.go:250 component=storeset msg="adding new store to query storeset" address=prometheus-0.thanos-sidecar.energy:10901
level=info ts=2019-03-13T13:49:37.527298467Z caller=storeset.go:250 component=storeset msg="adding new store to query storeset" address=prometheus-1.thanos-sidecar.corp-mon:10901
level=info ts=2019-03-13T13:49:37.52731974Z caller=storeset.go:250 component=storeset msg="adding new store to query storeset" address=prometheus-system-0.thanos-sidecar.sys-prom:10901
level=info ts=2019-03-13T13:49:37.527333155Z caller=storeset.go:250 component=storeset msg="adding new store to query storeset" address=thanos-rule.sys-mon:10901
level=info ts=2019-03-13T13:49:37.527351515Z caller=storeset.go:250 component=storeset msg="adding new store to query storeset" address=prometheus-0.thanos-sidecar.telecom:10901
level=info ts=2019-03-13T13:49:37.527362503Z caller=storeset.go:250 component=storeset msg="adding new store to query storeset" address=thanos-store.sys-prom:10901
level=info ts=2019-03-13T13:49:37.527386106Z caller=storeset.go:250 component=storeset msg="adding new store to query storeset" address=prometheus-system-1.thanos-sidecar.sys-prom:10901
level=info ts=2019-03-13T13:49:37.527400067Z caller=storeset.go:250 component=storeset msg="adding new store to query storeset" address=prometheus-1.thanos-sidecar.energy:10901
level=info ts=2019-03-13T13:49:37.527422702Z caller=storeset.go:250 component=storeset msg="adding new store to query storeset" address=thanos-store.energy:10901
level=info ts=2019-03-13T13:49:37.527452567Z caller=storeset.go:250 component=storeset msg="adding new store to query storeset" address=prometheus-1.thanos-sidecar.sys-mon:10901
level=info ts=2019-03-13T13:49:37.527500935Z caller=storeset.go:250 component=storeset msg="adding new store to query storeset" address=prometheus-0.thanos-sidecar.corp-mon:10901
level=info ts=2019-03-13T13:49:37.527519217Z caller=storeset.go:250 component=storeset msg="adding new store to query storeset" address=thanos-rule.energy:10901
level=info ts=2019-03-13T13:49:37.52754057Z caller=storeset.go:250 component=storeset msg="adding new store to query storeset" address=prometheus-1.thanos-sidecar.telecom:10901
level=info ts=2019-03-13T13:49:37.527557559Z caller=storeset.go:250 component=storeset msg="adding new store to query storeset" address=thanos-rule.telecom:10901
level=info ts=2019-03-13T13:49:37.52757928Z caller=storeset.go:250 component=storeset msg="adding new store to query storeset" address=thanos-store.telecom:10901
level=info ts=2019-03-13T13:49:37.527597595Z caller=storeset.go:250 component=storeset msg="adding new store to query storeset" address=thanos-rule.corp-mon:10901
level=info ts=2019-03-13T13:49:37.527629153Z caller=storeset.go:250 component=storeset msg="adding new store to query storeset" address=prometheus-1.thanos-sidecar.customer-platform:10901
level=info ts=2019-03-13T13:49:37.527647733Z caller=storeset.go:250 component=storeset msg="adding new store to query storeset" address=thanos-rule.sys-prom:10901
level=info ts=2019-03-13T13:49:37.527669375Z caller=storeset.go:250 component=storeset msg="adding new store to query storeset" address=thanos-rule.customer-platform:10901
curl -v -m 150 -X GET "http://localhost:10902/api/v1/series?match[]=messages_consumed_total%7Bkubernetes_namespace%3D~%22acs%7Ctelecom%22%7D&start=1552322141&end=1552408541"
Note: Unnecessary use of -X or --request, GET is already inferred.
* Expire in 0 ms for 6 (transfer 0x559f6e53c7b0)
* Expire in 150000 ms for 8 (transfer 0x559f6e53c7b0)
*   Trying ::1...
* TCP_NODELAY set
* Expire in 75000 ms for 3 (transfer 0x559f6e53c7b0)
* Expire in 200 ms for 4 (transfer 0x559f6e53c7b0)
* Connected to localhost (::1) port 10902 (#0)
> GET /api/v1/series?match[]=messages_consumed_total%7Bkubernetes_namespace%3D~%22acs%7Ctelecom%22%7D&start=1552322141&end=1552408541 HTTP/1.1
> Host: localhost:10902
> User-Agent: curl/7.64.0
> Accept: */*
>
* Operation timed out after 150000 milliseconds with 0 bytes received
* Closing connection 0
curl: (28) Operation timed out after 150000 milliseconds with 0 bytes received

If I don't specify a timeout in curl, the request hangs forever.

This breaks our Grafana dashboards :/

What you expected to happen

  1. The query timeout (--query.timeout=10s) to actually work (see the sketch below this list).
  2. None of this to happen in the first place; I'm not sure why it's hanging.
  3. Maybe more logging around the slowest StoreAPI, because right now it's impossible to find it.
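
To illustrate point 1, here is a minimal sketch (plain Go, not Thanos code; the endpoint path and the never-answering backend are illustrative) of how a flag like --query.timeout is generally expected to behave: the handler's context carries a deadline, so the client gets a timeout error instead of a connection that hangs forever.

package main

import (
	"context"
	"fmt"
	"net/http"
	"time"
)

func seriesHandler(w http.ResponseWriter, r *http.Request) {
	// Bound the whole request by a deadline, analogous to --query.timeout=10s.
	ctx, cancel := context.WithTimeout(r.Context(), 10*time.Second)
	defer cancel()

	slow := make(chan struct{}) // stands in for a backend that never responds

	select {
	case <-slow:
		fmt.Fprintln(w, "data")
	case <-ctx.Done():
		// Without this branch the request would hang indefinitely,
		// which is the behaviour reported above.
		http.Error(w, ctx.Err().Error(), http.StatusGatewayTimeout)
	}
}

func main() {
	http.HandleFunc("/api/v1/series", seriesHandler)
	http.ListenAndServe(":10902", nil)
}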

How to reproduce it (as minimally and precisely as possible):

Full logs to relevant components

Logs

Thanos query debug logs:


(startup logs identical to the ones shown above)
level=debug ts=2019-03-13T13:51:10.209916543Z caller=proxy.go:176 msg="store Addr: thanos-store.customer-platform:10901 Labels: [] Mint: 1549807838621 Maxt: 1552478400000 queried;store Addr: thanos-store.sys-prom:10901 Labels: [] Mint: 1551571200000 Maxt: 1552478400000 queried;store Addr: prometheus-1.thanos-sidecar.telecom:10901 Labels: [{cloud_provider aws {} [] 0} {kubernetes_cluster dev-aws {} [] 0} {monitor telecom-prometheus {} [] 0} {replica telecom-prometheus-1 {} [] 0} {uw_environment dev {} [] 0}] Mint: 1552298400000 Maxt: 9223372036854775807 queried;store Addr: thanos-store.telecom:10901 Labels: [] Mint: 1535673600000 Maxt: 1552478400000 queried;store Addr: thanos-store.energy:10901 Labels: [] Mint: 1548951238613 Maxt: 1552478400000 queried;store Addr: prometheus-0.thanos-sidecar.corp-mon:10901 Labels: [{cloud_provider aws {} [] 0} {kubernetes_cluster dev-aws {} [] 0} {monitor corp-prometheus {} [] 0} {replica corp-prometheus-0 {} [] 0} {uw_environment dev {} [] 0}] Mint: 1552298400000 Maxt: 9223372036854775807 queried;store Addr: thanos-rule.telecom:10901 Labels: [] Mint: 0 Maxt: 9223372036854775807 queried;store Addr: thanos-store.sys-mon:10901 Labels: [] Mint: 1551744000000 Maxt: 1552478400000 queried;store Addr: prometheus-0.thanos-sidecar.energy:10901 Labels: [{cloud_provider aws {} [] 0} {kubernetes_cluster dev-aws {} [] 0} {monitor energy-prometheus {} [] 0} {replica energy-prometheus-0 {} [] 0} {uw_environment dev {} [] 0}] Mint: 1552399200000 Maxt: 9223372036854775807 queried;store Addr: prometheus-1.thanos-sidecar.corp-mon:10901 Labels: [{cloud_provider aws {} [] 0} {kubernetes_cluster dev-aws {} [] 0} {monitor corp-prometheus {} [] 0} {replica corp-prometheus-1 {} [] 0} {uw_environment dev {} [] 0}] Mint: 1552298400000 Maxt: 9223372036854775807 queried;store Addr: prometheus-0.thanos-sidecar.telecom:10901 Labels: [{cloud_provider aws {} [] 0} {kubernetes_cluster dev-aws {} [] 0} {monitor telecom-prometheus {} [] 0} {replica telecom-prometheus-0 {} [] 0} {uw_environment dev {} [] 0}] Mint: 1552298400000 Maxt: 9223372036854775807 queried;store Addr: prometheus-system-1.thanos-sidecar.sys-prom:10901 Labels: [{cloud_provider aws {} [] 0} {kubernetes_cluster dev-aws {} [] 0} {monitor sys-prom-prometheus {} [] 0} {replica sys-prom-prometheus-system-1 {} [] 0} {uw_environment dev {} [] 0}] Mint: 1552384800000 Maxt: 9223372036854775807 queried;store Addr: thanos-rule.corp-mon:10901 Labels: [] Mint: 0 Maxt: 9223372036854775807 queried;store Addr: thanos-rule.sys-prom:10901 Labels: [] Mint: 0 Maxt: 9223372036854775807 queried;store Addr: thanos-rule.sys-mon:10901 Labels: [] Mint: 0 Maxt: 9223372036854775807 queried;store Addr: prometheus-1.thanos-sidecar.sys-mon:10901 Labels: [{cloud_provider aws {} [] 0} {kubernetes_cluster dev-aws {} [] 0} {monitor sys-mon-prometheus {} [] 0} {replica sys-mon-prometheus-1 {} [] 0} {uw_environment dev {} [] 0}] Mint: 1552384800000 Maxt: 9223372036854775807 queried;store Addr: thanos-rule.energy:10901 Labels: [] Mint: 0 Maxt: 9223372036854775807 queried;store Addr: thanos-rule.customer-platform:10901 Labels: [] Mint: 0 Maxt: 9223372036854775807 queried;store Addr: prometheus-1.thanos-sidecar.customer-platform:10901 Labels: [{cloud_provider aws {} [] 0} {kubernetes_cluster dev-aws {} [] 0} {monitor customer-platform-prometheus {} [] 0} {replica customer-platform-prometheus-1 {} [] 0} {uw_environment dev {} [] 0}] Mint: 1552298400000 Maxt: 9223372036854775807 queried;store Addr: prometheus-0.thanos-sidecar.sys-mon:10901 Labels: [{cloud_provider 
aws {} [] 0} {kubernetes_cluster dev-aws {} [] 0} {monitor sys-mon-prometheus {} [] 0} {replica sys-mon-prometheus-0 {} [] 0} {uw_environment dev {} [] 0}] Mint: 1552384800000 Maxt: 9223372036854775807 queried;store Addr: prometheus-0.thanos-sidecar.customer-platform:10901 Labels: [{cloud_provider aws {} [] 0} {kubernetes_cluster dev-aws {} [] 0} {monitor customer-platform-prometheus {} [] 0} {replica customer-platform-prometheus-0 {} [] 0} {uw_environment dev {} [] 0}] Mint: 1552298400000 Maxt: 9223372036854775807 queried;store Addr: thanos-store.corp-mon:10901 Labels: [] Mint: 1550053933834 Maxt: 1552478400000 queried;store Addr: prometheus-system-0.thanos-sidecar.sys-prom:10901 Labels: [{cloud_provider aws {} [] 0} {kubernetes_cluster dev-aws {} [] 0} {monitor sys-prom-prometheus {} [] 0} {replica sys-prom-prometheus-system-0 {} [] 0} {uw_environment dev {} [] 0}] Mint: 1552377600000 Maxt: 9223372036854775807 queried;store Addr: prometheus-1.thanos-sidecar.energy:10901 Labels: [{cloud_provider aws {} [] 0} {kubernetes_cluster dev-aws {} [] 0} {monitor energy-prometheus {} [] 0} {replica energy-prometheus-1 {} [] 0} {uw_environment dev {} [] 0}] Mint: 1552399200000 Maxt: 9223372036854775807 queried"
level=debug ts=2019-03-13T14:01:07.206028563Z caller=proxy.go:176 msg="store Addr: prometheus-1.thanos-sidecar.sys-mon:10901 Labels: [{cloud_provider aws {} [] 0} {kubernetes_cluster dev-aws {} [] 0} {monitor sys-mon-prometheus {} [] 0} {replica sys-mon-prometheus-1 {} [] 0} {uw_environment dev {} [] 0}] Mint: 1552384800000 Maxt: 9223372036854775807 queried;store Addr: thanos-rule.energy:10901 Labels: [] Mint: 0 Maxt: 9223372036854775807 queried;store Addr: thanos-rule.customer-platform:10901 Labels: [] Mint: 0 Maxt: 9223372036854775807 queried;store Addr: thanos-rule.sys-mon:10901 Labels: [] Mint: 0 Maxt: 9223372036854775807 queried;store Addr: prometheus-0.thanos-sidecar.customer-platform:10901 Labels: [{cloud_provider aws {} [] 0} {kubernetes_cluster dev-aws {} [] 0} {monitor customer-platform-prometheus {} [] 0} {replica customer-platform-prometheus-0 {} [] 0} {uw_environment dev {} [] 0}] Mint: 1552298400000 Maxt: 9223372036854775807 queried;store Addr: thanos-store.corp-mon:10901 Labels: [] Mint: 1550053933834 Maxt: 1552478400000 queried;store Addr: prometheus-system-0.thanos-sidecar.sys-prom:10901 Labels: [{cloud_provider aws {} [] 0} {kubernetes_cluster dev-aws {} [] 0} {monitor sys-prom-prometheus {} [] 0} {replica sys-prom-prometheus-system-0 {} [] 0} {uw_environment dev {} [] 0}] Mint: 1552377600000 Maxt: 9223372036854775807 queried;store Addr: prometheus-1.thanos-sidecar.energy:10901 Labels: [{cloud_provider aws {} [] 0} {kubernetes_cluster dev-aws {} [] 0} {monitor energy-prometheus {} [] 0} {replica energy-prometheus-1 {} [] 0} {uw_environment dev {} [] 0}] Mint: 1552399200000 Maxt: 9223372036854775807 queried;store Addr: prometheus-1.thanos-sidecar.customer-platform:10901 Labels: [{cloud_provider aws {} [] 0} {kubernetes_cluster dev-aws {} [] 0} {monitor customer-platform-prometheus {} [] 0} {replica customer-platform-prometheus-1 {} [] 0} {uw_environment dev {} [] 0}] Mint: 1552298400000 Maxt: 9223372036854775807 queried;store Addr: prometheus-0.thanos-sidecar.sys-mon:10901 Labels: [{cloud_provider aws {} [] 0} {kubernetes_cluster dev-aws {} [] 0} {monitor sys-mon-prometheus {} [] 0} {replica sys-mon-prometheus-0 {} [] 0} {uw_environment dev {} [] 0}] Mint: 1552384800000 Maxt: 9223372036854775807 queried;store Addr: thanos-store.sys-prom:10901 Labels: [] Mint: 1551571200000 Maxt: 1552478400000 queried;store Addr: prometheus-1.thanos-sidecar.telecom:10901 Labels: [{cloud_provider aws {} [] 0} {kubernetes_cluster dev-aws {} [] 0} {monitor telecom-prometheus {} [] 0} {replica telecom-prometheus-1 {} [] 0} {uw_environment dev {} [] 0}] Mint: 1552298400000 Maxt: 9223372036854775807 queried;store Addr: thanos-store.telecom:10901 Labels: [] Mint: 1535673600000 Maxt: 1552478400000 queried;store Addr: thanos-store.customer-platform:10901 Labels: [] Mint: 1549807838621 Maxt: 1552478400000 queried;store Addr: prometheus-0.thanos-sidecar.energy:10901 Labels: [{cloud_provider aws {} [] 0} {kubernetes_cluster dev-aws {} [] 0} {monitor energy-prometheus {} [] 0} {replica energy-prometheus-0 {} [] 0} {uw_environment dev {} [] 0}] Mint: 1552399200000 Maxt: 9223372036854775807 queried;store Addr: prometheus-1.thanos-sidecar.corp-mon:10901 Labels: [{cloud_provider aws {} [] 0} {kubernetes_cluster dev-aws {} [] 0} {monitor corp-prometheus {} [] 0} {replica corp-prometheus-1 {} [] 0} {uw_environment dev {} [] 0}] Mint: 1552298400000 Maxt: 9223372036854775807 queried;store Addr: prometheus-0.thanos-sidecar.telecom:10901 Labels: [{cloud_provider aws {} [] 0} {kubernetes_cluster dev-aws {} [] 0} 
{monitor telecom-prometheus {} [] 0} {replica telecom-prometheus-0 {} [] 0} {uw_environment dev {} [] 0}] Mint: 1552298400000 Maxt: 9223372036854775807 queried;store Addr: prometheus-system-1.thanos-sidecar.sys-prom:10901 Labels: [{cloud_provider aws {} [] 0} {kubernetes_cluster dev-aws {} [] 0} {monitor sys-prom-prometheus {} [] 0} {replica sys-prom-prometheus-system-1 {} [] 0} {uw_environment dev {} [] 0}] Mint: 1552384800000 Maxt: 9223372036854775807 queried;store Addr: thanos-store.energy:10901 Labels: [] Mint: 1548951238613 Maxt: 1552478400000 queried;store Addr: prometheus-0.thanos-sidecar.corp-mon:10901 Labels: [{cloud_provider aws {} [] 0} {kubernetes_cluster dev-aws {} [] 0} {monitor corp-prometheus {} [] 0} {replica corp-prometheus-0 {} [] 0} {uw_environment dev {} [] 0}] Mint: 1552298400000 Maxt: 9223372036854775807 queried;store Addr: thanos-rule.telecom:10901 Labels: [] Mint: 0 Maxt: 9223372036854775807 queried;store Addr: thanos-store.sys-mon:10901 Labels: [] Mint: 1551744000000 Maxt: 1552478400000 queried;store Addr: thanos-rule.sys-prom:10901 Labels: [] Mint: 0 Maxt: 9223372036854775807 queried;store Addr: thanos-rule.corp-mon:10901 Labels: [] Mint: 0 Maxt: 9223372036854775807 queried"

povilasv (Member, Author) commented Mar 15, 2019

New details:

It turns out we have a really slow Thanos Store; I think it's practically sleeping while streaming data :D. But since it advertises all the needed time ranges and labels, it gets fanned out to for every query, so we started seeing issues across all queries.

As a result, none of the queries would load.
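
A toy sketch of why this happens (plain Go; the store addresses and delays are made up): a fan-out that has to hear back from every matching StoreAPI before it can merge results is only as fast as its slowest member, so one sleeping store stalls every query it matches.

package main

import (
	"fmt"
	"time"
)

// fetch simulates one StoreAPI Series stream; delay stands in for store speed.
func fetch(addr string, delay time.Duration, out chan<- string) {
	time.Sleep(delay)
	out <- addr
}

func main() {
	stores := map[string]time.Duration{
		"fast-store:10901":   50 * time.Millisecond,
		"also-fast:10901":    80 * time.Millisecond,
		"sleepy-store:10901": time.Hour, // the "practically sleeping" store
	}
	out := make(chan string)
	for addr, d := range stores {
		go fetch(addr, d, out)
	}
	// The merged response is complete only once every store has answered,
	// so this loop (and the whole query) blocks on sleepy-store.
	for range stores {
		fmt.Println("got:", <-out)
	}
}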

bwplotka (Member) commented
Yeah, I think having a quicker store is a must-have though, so you should look into that first (:

bwplotka (Member) commented
We can try fixing timeouts though (see the sketch below).
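
One possible direction, sketched in plain Go (not the actual Thanos proxy code; the addresses and the 2s deadline are invented for illustration): give each per-store request its own deadline, so a single sleeping store yields an error and can be reported, instead of stalling the whole fan-out.

package main

import (
	"context"
	"fmt"
	"time"
)

// queryStore simulates one per-store request bounded by its own deadline.
func queryStore(ctx context.Context, addr string, delay time.Duration) error {
	ctx, cancel := context.WithTimeout(ctx, 2*time.Second)
	defer cancel()

	select {
	case <-time.After(delay): // stands in for the store answering
		return nil
	case <-ctx.Done():
		// Naming the offending store here would also address point 3 above.
		return fmt.Errorf("store %s: %w", addr, ctx.Err())
	}
}

func main() {
	ctx := context.Background()
	fmt.Println(queryStore(ctx, "fast-store:10901", 100*time.Millisecond))
	fmt.Println(queryStore(ctx, "sleepy-store:10901", time.Hour))
}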

povilasv (Member, Author) commented
I will look into Store, but for me this was a really good chance to make Thanos Query super reliable, since I had all the data and queries kept failing :P

povilasv (Member, Author) commented
This is a dup of #705, so closing.
