TransformProcessor doesn't set TraceID and SpanID within LogContext #32190
Comments
Pinging code owners:
See Adding Labels via Comments if you do not have permissions to add labels yourself.
After looking in more detail at the W3C tracing protocol, I figured out that the version part of the traceparent header value is just one byte, so assuming another byte is taken by '-', the Substring function should start reading from the byte at index 2 until 16 bytes are read. I tested with 2 as the starting index instead of 3, but this results in the same error. My guess is that the traceparent header itself, when set as an attribute by the Envoy access log service in the telemetry payload, is represented with the wrong encoding, and so the error propagates downstream in data processing. I didn't find anything about hex encoding conversion in the util functions in OTTL.
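For reference, the W3C traceparent value has the shape version-traceid-parentid-flags (2, 32, 16, and 2 hex characters, separated by '-'), so the trace ID starts at string index 3 and the parent/span ID at index 36. A minimal Go sketch of that layout, using a made-up example value:

```go
package main

import (
	"fmt"
	"strings"
)

func main() {
	// Illustrative W3C traceparent: <version>-<trace-id>-<parent-id>-<flags>
	traceparent := "00-0af7651916cd43dd8448eb211c80319c-b7ad6b7169203331-01"

	parts := strings.Split(traceparent, "-")
	fmt.Println("version  :", parts[0]) // 2 hex characters, indices 0-1
	fmt.Println("trace-id :", parts[1]) // 32 hex characters, starting at index 3
	fmt.Println("parent-id:", parts[2]) // 16 hex characters, starting at index 36

	// The same fields via index arithmetic, matching the OTTL calls
	// Substring(traceparent, 3, 32) and Substring(traceparent, 36, 16).
	fmt.Println(traceparent[3:3+32] == parts[1])   // true
	fmt.Println(traceparent[36:36+16] == parts[2]) // true
}
```

Running this prints the three fields and confirms that Substring(traceparent, 3, 32) and Substring(traceparent, 36, 16) select the trace ID and span ID.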
We have #31929 to track a Hex function.

processors:
  transform/envoyal:
    error_mode: ignore
    log_statements:
      - context: log
        statements:
          - set(attributes["trace-id"], Substring(attributes["traceparent"], 3, 32))
          - set(attributes["parent-id"], Substring(attributes["traceparent"], 36, 16))
          - set(trace_id.string, attributes["trace-id"])
          - set(span_id.string, attributes["parent-id"])

That should capture the trace ID and span ID from the traceparent attribute.
This issue has been inactive for 60 days. It will be closed in 60 days if there is no activity. To ping code owners by adding a component label, see Adding Labels via Comments, or if you are unsure of which component this issue relates to, please ping a code owner.
Pinging code owners:
See Adding Labels via Comments if you do not have permissions to add labels yourself.
I believe this issue is solved by the new Hex function #31929. Please ping me if you disagree.
Component(s)
processor/transform
What happened?
Description
Hello, I have set up a pipeline with the transform processor to extract the TraceID and SpanID from the traceparent HTTP header and set trace_id and span_id in the LogContext.
The key here is:
Steps to Reproduce
Expected Result
This is the payload I expect:
Actual Result
This is the payload I actually receive:
Hope you can help me with this. Thanks in advance for your time.
Collector version
0.97.0
Environment information
OS: Ubuntu 22.04
OpenTelemetry Collector configuration
Log output
{"level":"error","ts":1712330461.6687455,"caller":"logs/processor.go:54","msg":"failed processing logs","kind":"processor","name":"transform/envoyal","pipeline":"logs","error":"failed to execute statement: set(trace_id.string, attributes["trace-id"]), trace ids must be 32 hex characters","stacktrace":"github.com/open-telemetry/opentelemetry-collector-contrib/processor/transformprocessor/internal/logs.(*Processor).ProcessLogs\n\tgithub.com/open-telemetry/opentelemetry-collector-contrib/processor/[email protected]/internal/logs/processor.go:54\ngo.opentelemetry.io/collector/processor/processorhelper.NewLogsProcessor.func1\n\tgo.opentelemetry.io/collector/[email protected]/processorhelper/logs.go:48\ngo.opentelemetry.io/collector/consumer.ConsumeLogsFunc.ConsumeLogs\n\tgo.opentelemetry.io/collector/[email protected]/logs.go:25\ngo.opentelemetry.io/collector/processor/processorhelper.NewLogsProcessor.func1\n\tgo.opentelemetry.io/collector/[email protected]/processorhelper/logs.go:56\ngo.opentelemetry.io/collector/consumer.ConsumeLogsFunc.ConsumeLogs\n\tgo.opentelemetry.io/collector/[email protected]/logs.go:25\ngo.opentelemetry.io/collector/consumer.ConsumeLogsFunc.ConsumeLogs\n\tgo.opentelemetry.io/collector/[email protected]/logs.go:25\ngo.opentelemetry.io/collector/internal/fanoutconsumer.(*logsConsumer).ConsumeLogs\n\tgo.opentelemetry.io/[email protected]/internal/fanoutconsumer/logs.go:62\ngo.opentelemetry.io/collector/receiver/otlpreceiver/internal/logs.(*Receiver).Export\n\tgo.opentelemetry.io/collector/receiver/[email protected]/internal/logs/otlp.go:41\ngo.opentelemetry.io/collector/pdata/plog/plogotlp.rawLogsServer.Export\n\tgo.opentelemetry.io/collector/[email protected]/plog/plogotlp/grpc.go:88\ngo.opentelemetry.io/collector/pdata/internal/data/protogen/collector/logs/v1._LogsService_Export_Handler.func1\n\tgo.opentelemetry.io/collector/[email protected]/internal/data/protogen/collector/logs/v1/logs_service.pb.go:311\ngo.opentelemetry.io/collector/config/configgrpc.(*ServerConfig).toServerOption.enhanceWithClientInformation.func9\n\tgo.opentelemetry.io/collector/config/[email protected]/configgrpc.go:398\ngo.opentelemetry.io/collector/pdata/internal/data/protogen/collector/logs/v1._LogsService_Export_Handler\n\tgo.opentelemetry.io/collector/[email protected]/internal/data/protogen/collector/logs/v1/logs_service.pb.go:313\ngoogle.golang.org/grpc.(*Server).processUnaryRPC\n\tgoogle.golang.org/[email protected]/server.go:1386\ngoogle.golang.org/grpc.(*Server).handleStream\n\tgoogle.golang.org/[email protected]/server.go:1797\ngoogle.golang.org/grpc.(*Server).serveStreams.func2.1\n\tgoogle.golang.org/[email protected]/server.go:1027"}
Additional context