
SimpleSpanProcessor support in OTel #29448

Closed
brunobat opened this issue Nov 23, 2022 · 7 comments
Labels: area/tracing, kind/enhancement (New feature or request)

@brunobat (Contributor) commented Nov 23, 2022

Description

We need to support SimpleSpanProcessor in OTel to allow lambda functions on cloud providers to report their traces in time.
See #26250

The default processor accumulates spans and sends them in batches, which is why it is called BatchSpanProcessor.
Lambda functions shut down quickly and sometimes there is no time to send the traces out.
This feature will dispatch each span as soon as it ends, making sure the span goes out.

This applies to LogRecords as well.
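
For reference, this is roughly what the two processors look like when wired directly with the OpenTelemetry Java SDK (a minimal sketch; the exporter choice and endpoint are placeholders, not what Quarkus configures by default):

```java
import io.opentelemetry.exporter.otlp.trace.OtlpGrpcSpanExporter;
import io.opentelemetry.sdk.trace.SdkTracerProvider;
import io.opentelemetry.sdk.trace.export.BatchSpanProcessor;
import io.opentelemetry.sdk.trace.export.SimpleSpanProcessor;

public class ProcessorComparison {

    public static void main(String[] args) {
        // Placeholder exporter/endpoint for illustration only.
        OtlpGrpcSpanExporter exporter = OtlpGrpcSpanExporter.builder()
                .setEndpoint("http://localhost:4317")
                .build();

        // Default behaviour: spans are queued and exported in batches by a worker thread.
        SdkTracerProvider batching = SdkTracerProvider.builder()
                .addSpanProcessor(BatchSpanProcessor.builder(exporter).build())
                .build();

        // SimpleSpanProcessor: each span is exported as soon as it ends, so a
        // short-lived lambda does not shut down with spans still sitting in a queue.
        SdkTracerProvider simple = SdkTracerProvider.builder()
                .addSpanProcessor(SimpleSpanProcessor.create(exporter))
                .build();
    }
}
```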

Implementation ideas

This should be trivial once OTel SPI access is available, after the #26444 work is finished.
It should mostly be a matter of documenting the procedure.
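
A rough sketch of what the documented procedure might look like once the SPI is exposed. This is hypothetical wiring: the producer class is illustrative, and whether a plain CDI-produced SpanProcessor bean is picked up this way depends on how #26444 lands.

```java
import io.opentelemetry.exporter.otlp.trace.OtlpGrpcSpanExporter;
import io.opentelemetry.sdk.trace.SpanProcessor;
import io.opentelemetry.sdk.trace.export.SimpleSpanProcessor;
import jakarta.enterprise.context.ApplicationScoped;
import jakarta.enterprise.inject.Produces;

@ApplicationScoped
public class SimpleSpanProcessorProducer {

    // Hypothetical: assumes the SPI work in #26444 lets Quarkus use a
    // CDI-produced SpanProcessor instead of the default BatchSpanProcessor.
    @Produces
    public SpanProcessor simpleSpanProcessor() {
        // Placeholder exporter; a real setup would reuse the exporter
        // configured through the quarkus.otel.* properties.
        return SimpleSpanProcessor.create(
                OtlpGrpcSpanExporter.builder()
                        .setEndpoint("http://localhost:4317")
                        .build());
    }
}
```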

@geoand (Contributor) commented Jul 21, 2023

Is this still relevant?

@brunobat (Contributor, Author)

Very... Especially for lambda functions.

@brunobat (Contributor, Author) commented Sep 5, 2023

FYI, This project can be used as a helper: https://github.com/brunobat/reproducers/tree/main/graalvm-aws-rest-26250

@brunobat (Contributor, Author)

Assigning to @alesj

@brunobat (Contributor, Author) commented Oct 4, 2023

@arik-dig has this problem in tests that use OpenTelemetry: the test closes before the export finishes and an exception is thrown.
Very tight configs like these help, but they don't work all the time:

quarkus.otel.bsp.max.export.batch.size=1
quarkus.otel.bsp.schedule.delay=0.001S
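
Until the simple processor is available, one possible mitigation is to flush the tracer provider before the application (and its Vert.x-based exporter client) shuts down. A sketch, assuming the test or shutdown hook can obtain the OpenTelemetrySdk instance (for example via CDI injection):

```java
import java.util.concurrent.TimeUnit;

import io.opentelemetry.sdk.OpenTelemetrySdk;

public class TracingTestSupport {

    // Force-flush pending spans so the BatchSpanProcessor is not left
    // exporting against an already-closed client. The 10-second timeout
    // is an arbitrary choice for illustration.
    static void flushSpans(OpenTelemetrySdk sdk) {
        sdk.getSdkTracerProvider()
           .forceFlush()
           .join(10, TimeUnit.SECONDS);
    }
}
```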

@arik-dig (Contributor) commented Oct 4, 2023

When working with version 3.4.1, I see the following warning:

19:57:32 WARNING traceId=, parentId=, spanId=, sampled= [io.op.sd.tr.ex.BatchSpanProcessor] (BatchSpanProcessor_WorkerThread-1) Exporter threw an Exception: java.lang.IllegalStateException: Client is closed
	at io.vertx.core.http.impl.HttpClientImpl.checkClosed(HttpClientImpl.java:696)
	at io.vertx.core.http.impl.HttpClientImpl.doRequest(HttpClientImpl.java:597)
	at io.vertx.core.http.impl.HttpClientImpl.request(HttpClientImpl.java:465)
	at io.vertx.grpc.client.impl.GrpcClientImpl.request(GrpcClientImpl.java:50)
	at io.quarkus.opentelemetry.runtime.exporter.otlp.VertxGrpcExporter.export(VertxGrpcExporter.java:91)
	at io.quarkus.opentelemetry.runtime.exporter.otlp.VertxGrpcExporter.export(VertxGrpcExporter.java:128)
	at io.opentelemetry.sdk.trace.export.BatchSpanProcessor$Worker.exportCurrentBatch(BatchSpanProcessor.java:326)
	at io.opentelemetry.sdk.trace.export.BatchSpanProcessor$Worker.flush(BatchSpanProcessor.java:271)
	at io.opentelemetry.sdk.trace.export.BatchSpanProcessor$Worker.run(BatchSpanProcessor.java:238)
	at java.base/java.lang.Thread.run(Thread.java:833)

As @brunobat mentioned, using the configs:

quarkus.otel.bsp.max.export.batch.size=1
quarkus.otel.bsp.schedule.delay=0.001S

reduces the frequency of it happening, but it's inconsistent.

@brunobat (Contributor, Author)
This has been implemented here: #43983
