
[processor/tailsampling] Fixed sampling policy evaluation debug logging batch metrics #37040

Merged: 4 commits into open-telemetry:main on Jan 8, 2025

Conversation

portertech (Contributor)

Description

Currently, the processor always logs (at debug) `sampled=0` and `notSampled=0` for every batch processed. This pull request fixes those metrics.
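For context, here is a minimal sketch of the class of bug being fixed, assuming the processor tallies per-tick decision counts in local counters before emitting the debug log line. The names (`tickMetrics`, `evaluateBatch`) are illustrative, not the repository's actual identifiers:

```go
package main

import "log"

// decision mirrors a sampling outcome for a trace in a batch.
type decision int

const (
	sampled decision = iota
	notSampled
)

// tickMetrics collects per-tick counts that feed the debug log line.
// (Illustrative only; the real processor has its own metrics bookkeeping.)
type tickMetrics struct {
	sampled    int64
	notSampled int64
}

// evaluateBatch walks the decisions for one tick. Before the fix, counts
// like these were effectively never propagated to the log call, so the
// debug output always reported sampled=0 notSampled=0.
func evaluateBatch(decisions []decision) tickMetrics {
	var m tickMetrics
	for _, d := range decisions {
		switch d {
		case sampled:
			m.sampled++ // the fix: count each outcome as it is decided
		case notSampled:
			m.notSampled++
		}
	}
	return m
}

func main() {
	m := evaluateBatch([]decision{sampled, notSampled, sampled})
	log.Printf("sampling policy evaluation completed, sampled=%d notSampled=%d", m.sampled, m.notSampled)
}
```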

@portertech portertech requested review from jpkrohling and a team as code owners January 6, 2025 20:50
@github-actions github-actions bot added the processor/tailsampling Tail sampling processor label Jan 6, 2025
@jpkrohling jpkrohling changed the title [tailsamplingprocessor] Fixed sampling policy evaluation debug logging batch metrics [processor/tailsampling] Fixed sampling policy evaluation debug logging batch metrics Jan 7, 2025
@jpkrohling (Member) left a comment:

I think this would deserve a new test, to avoid a regression, but I'm OK merging this as is, given that this is only used in debug logging at the moment. Let me know what you prefer, @portertech.

Signed-off-by: Sean Porter <[email protected]>
@portertech (Contributor, Author) commented:

@jpkrohling Thank you for the review! I added a test; is that what you had imagined?
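For readers curious what such a regression test could look like, here is a hedged sketch written against the illustrative `evaluateBatch` helper above; the PR's real test exercises the processor's own evaluation path and internal metrics, whose exact identifiers are not shown on this page:

```go
package main

import "testing"

// TestBatchMetricsAreCounted guards against the regression where every
// tick reported sampled=0 notSampled=0 regardless of the decisions made.
func TestBatchMetricsAreCounted(t *testing.T) {
	m := evaluateBatch([]decision{sampled, sampled, notSampled})

	if m.sampled != 2 {
		t.Errorf("sampled = %d, want 2", m.sampled)
	}
	if m.notSampled != 1 {
		t.Errorf("notSampled = %d, want 1", m.notSampled)
	}
}
```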

@jpkrohling jpkrohling merged commit 617b0bb into open-telemetry:main Jan 8, 2025
161 checks passed
@github-actions github-actions bot added this to the next release milestone Jan 8, 2025
AkhigbeEromo pushed a commit to sematext/opentelemetry-collector-contrib that referenced this pull request Jan 13, 2025
…ng batch metrics (open-telemetry#37040)

Labels
processor/tailsampling Tail sampling processor
3 participants