
Loki /loki/api/v1/label/<name>/values : query parameter not respected #10993

Closed
coderazzi opened this issue Oct 22, 2023 · 7 comments
Labels
component/api type/bug Something is not working as expected

Comments

@coderazzi

Describe the bug
Using the HTTP API, querying label values with a stream selector produces the same result as when no selector is used.

To Reproduce
Steps to reproduce the behavior:

  1. Started Loki 2.9.2
  2. Pushed several logs with the following labels (the push is done manually through the HTTP API; see the sketch after this list):
  • idn='4dc71fb5-b608-4078-8685-895cb9403bd6_0', type='Request'
  • idn='773ad8a9-8706-4981-a222-a9537da8fc55_0', type='Request'
  • idn='c58a300e-5247-4ad0-ad10-ed3d555c07b2_5', type='Request'
  • idn='c58a300e-5247-4ad0-ad10-ed3d555c07b2_6', type='Result'
  • idn='e2f6d0cc-6e44-4978-8a34-6b1d1832bde6_0', type='MRA_Result'
  3. curl -G http://localhost:3100/loki/api/v1/label/idn/values --data-urlencode 'query={type="Result"}'
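
A minimal sketch of such a manual push (the exact log lines are not part of this issue; the label values come from the list above, and the timestamp and log line are illustrative):

# Loki's push endpoint expects Unix-epoch-nanosecond timestamps as strings
curl -s -X POST http://localhost:3100/loki/api/v1/push \
  -H 'Content-Type: application/json' \
  -d '{
    "streams": [
      {
        "stream": {"idn": "c58a300e-5247-4ad0-ad10-ed3d555c07b2_6", "type": "Result"},
        "values": [["'"$(date +%s)000000000"'", "example log line"]]
      }
    ]
  }'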

Expected behavior
Only one item should be returned.
This is in fact what happens on the first attempts, but after a while the response contains all five entries.
Restarting Loki doesn't help.
A query with an invalid stream selector (wrong label name or value) also produces the five entries:

curl -G http://localhost:3100/loki/api/v1/label/idn/values --data-urlencode 'query={typeINVALID="Result"}'

Doing curl -G http://localhost:3100/loki/api/v1/label/typeINVALID/values
succeeds but returns nothing, as expected.

Note that Grafana shows the entries properly: a query using the respective stream selectors produces only the desired results.

Environment:

  • Infrastructure: docker
  • Deployment tool: manual

Screenshots, Promtail config, or terminal output
The logs are pushed manually

@coderazzi
Author

Additional information:

The five entries listed above are pushed at 18:33.

Querying label values works perfectly fine at the beginning, returning exactly the expected number of entries per type:

level=info ts=2023-10-22T19:03:26.833500229Z caller=roundtrip.go:295 org_id=fake traceID=39236af0cae60391 msg="executing query" type=labels label=idn length=1h0m0s query="{type=\"Request\"}"

ts=2023-10-22T19:03:26.834392327Z caller=spanlogger.go:86 user=fake level=info org_id=fake traceID=39236af0cae60391 latency=fast query_type=labels length=1h0m0s duration=497.792µs status=200 label=idn query="{type=\"Request\"}" splits=0 throughput=0B total_bytes=0B total_entries=3

level=info ts=2023-10-22T19:03:26.83465971Z caller=metrics.go:207 component=frontend org_id=fake traceID=39236af0cae60391 latency=fast query_type=labels length=1h0m0s duration=1.08216ms status=200 label=idn query="{type=\"Request\"}" splits=0 throughput=0B total_bytes=0B total_entries=3

Then the logs produce the following messages:
level=info ts=2023-10-22T19:03:47.327438206Z caller=flush.go:167 msg="flushing stream" user=fake fp=e5b7299bb98bd3f4 immediate=false num_chunks=1 labels="{idn=\"e2f6d0cc-6e44-4978-8a34-6b1d1832bde6_0\", type=\"MRA_Result\"}"

level=info ts=2023-10-22T19:03:47.327582166Z caller=flush.go:167 msg="flushing stream" user=fake fp=f58f38e72fc83d11 immediate=false num_chunks=1 labels="{idn=\"773ad8a9-8706-4981-a222-a9537da8fc55_0\", type=\"Request\"}"

level=info ts=2023-10-22T19:03:47.327553007Z caller=flush.go:167 msg="flushing stream" user=fake fp=257fe24b54141668 immediate=false num_chunks=1 labels="{idn=\"4dc71fb5-b608-4078-8685-895cb9403bd6_0\", type=\"Request\"}"

level=info ts=2023-10-22T19:03:47.327634311Z caller=flush.go:167 msg="flushing stream" user=fake fp=a4a668ec4d4ee53e immediate=false num_chunks=1 labels="{idn=\"c58a300e-5247-4ad0-ad10-ed3d555c07b2_5\", type=\"Request\"}"

level=info ts=2023-10-22T19:03:47.422775203Z caller=flush.go:167 msg="flushing stream" user=fake fp=6c5c1962fcef0c51 immediate=false num_chunks=1 labels="{idn=\"c58a300e-5247-4ad0-ad10-ed3d555c07b2_6\", type=\"Result\"}"

level=info ts=2023-10-22T19:04:13.537416624Z caller=table_manager.go:136 index-store=boltdb-shipper-2023-10-01 msg="uploading tables"

level=info ts=2023-10-22T19:04:13.537440343Z caller=table_manager.go:171 index-store=boltdb-shipper-2023-10-01 msg="handing over indexes to shipper"

level=info ts=2023-10-22T19:04:13.537511399Z caller=table.go:318 msg="handing over indexes to shipper index_19652"

level=info ts=2023-10-22T19:04:13.537529216Z caller=table.go:334 msg="finished handing over table index_19652"

level=info ts=2023-10-22T19:04:18.683048116Z caller=marker.go:202 msg="no marks file found"

level=info ts=2023-10-22T19:05:01.57698067Z caller=checkpoint.go:498 msg="atomic checkpoint finished" old=/mnt/loki/wal/checkpoint.000006.tmp new=/mnt/loki/wal/checkpoint.000006

level=info ts=2023-10-22T19:05:01.577375124Z caller=checkpoint.go:569 msg="checkpoint done" time=1m48.013444337s

level=info ts=2023-10-22T19:05:13.537042037Z caller=table_manager.go:136 index-store=boltdb-shipper-2023-10-01 msg="uploading tables"

level=info ts=2023-10-22T19:05:13.538192161Z caller=table_manager.go:171 index-store=boltdb-shipper-2023-10-01 msg="handing over indexes to shipper"

level=info ts=2023-10-22T19:05:13.53828372Z caller=table.go:318 msg="handing over indexes to shipper index_19652"

level=info ts=2023-10-22T19:05:13.538308085Z caller=table.go:334 msg="finished handing over table index_19652"

level=info ts=2023-10-22T19:05:18.682643822Z caller=marker.go:202 msg="no marks file found"

After this, the queries return all entries regardless of the stream selector: five entries for every type queried. Note that the issue starts at 19:03, around 30 minutes after the initial push.

@JStickler JStickler added component/api type/bug Something is not working as expected labels Oct 23, 2023
@coderazzi
Author

This is not actually a bug. The start or since parameter must also be included in the query, and then the right results are returned.
There is still a documentation error: the docs state that results default to the last 6 hours, while the period appears to be only 1 hour.
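
For example, restricting the range explicitly with since (a sketch; the 6h window is an arbitrary choice, pick whatever covers the push):

curl -G http://localhost:3100/loki/api/v1/label/idn/values \
  --data-urlencode 'query={type="Result"}' \
  --data-urlencode 'since=6h'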

@coderazzi
Author

I will create a separate issue for the documentation problem.

@coderazzi
Author

After additional tests, queries including since or start, or any combination with end, still fail to produce the right results. Reopening the issue...
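
A sketch of how such a combination can be passed (start and end are Unix epoch nanoseconds; the GNU date invocation is an assumption about the shell environment):

curl -G http://localhost:3100/loki/api/v1/label/idn/values \
  --data-urlencode 'query={type="Result"}' \
  --data-urlencode "start=$(date -d '-6 hours' +%s)000000000" \
  --data-urlencode "end=$(date +%s)000000000"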

@coderazzi coderazzi reopened this Oct 26, 2023
@periklis
Collaborator

@coderazzi Probably a duplicate of #10759; the fix is currently only in main. @JoaoBraveCoding provided backports that await review/approval by the maintainers.

@coderazzi
Author

@periklis I agree with the duplicate suggestion.
I have run the same queries after six hours and now I get the correct results (which is probably why I mistakenly closed this issue earlier).

@periklis
Collaborator

Ok closing then.
