[v1.5.1] otelcol.connector.spanmetrics causes erroneous warnings if sourced from anything other than the otelcol.receiver.otlp #2334

Open
WesselAtWork opened this issue Jan 3, 2025 · 0 comments
Labels
bug Something isn't working

Comments


WesselAtWork commented Jan 3, 2025

What's wrong?

Setup

As per the example in the docs:

otelcol.receiver.otlp "default" {
  http {}
  grpc {}

  output {
    traces  = [otelcol.connector.spanmetrics.default.input]
  }
}

otelcol.connector.spanmetrics "default" {
  // Since a default is not provided, the http.status_code dimension will be omitted
  // if the span does not contain http.status_code.
  dimension {
    name = "http.status_code"
  }

  // If the span is missing http.method, the connector will insert
  // the http.method dimension with value 'GET'.
  dimension {
    name = "http.method"
    default = "GET"
  }

  dimensions_cache_size = 333

  aggregation_temporality = "DELTA"

  histogram {
    unit = "s"
    explicit {
      buckets = ["333ms", "777s", "999h"]
    }
  }

  // The period on which all metrics (whose dimension keys remain in cache) will be emitted.
  metrics_flush_interval = "33s"

  namespace = "test.namespace"

  output {
    metrics = [otelcol.exporter.otlp.production.input]
  }
}

otelcol.exporter.otlp "production" {
  client {
    endpoint = sys.env("OTLP_SERVER_ENDPOINT")
  }
}

This configuration is fine.

There are no errors emitted from it.

Observation

However, sourcing it from anything else causes an annoying warning to be emitted by the Alloy process.

For example:

otelcol.receiver.otlp "default" {
  http {}
  grpc {}

  output {
    traces  = [otelcol.processor.tail_sampling.default.input]
  }
}

otelcol.processor.tail_sampling "default" {
  decision_wait               = "1s"
  policy {
    name = "drop-probes-scrapes"
    type = "string_attribute"

    string_attribute {
      key                    = "user_agent.original"
      values                 = ["Prometheus.+", "kube-probe.+"]
      enabled_regex_matching = true
      invert_match           = true
      cache_max_size         = 4
    }
  }

  output {
    traces  = [otelcol.exporter.otlp.production.input, otelcol.connector.spanmetrics.default.input]
  }
}

This caused the following warning:

ts=2025-01-01T10:10:10.101.101000000Z level=warn msg="Sender failed" component_path=/ component_id=otelcol.processor.tail_sampling.default error="telemetry type is not supported"

To double-check this, I added a batch processor:

...
  output {
    traces  = [otelcol.exporter.otlp.production.input, otelcol.processor.batch.issue_test.input]
  }
}

otelcol.processor.batch "issue_test" {
  timeout = "1s"
  output {
    traces  = [otelcol.connector.spanmetrics.default.input]
  }
}

It caused a similar warning:

ts=2025-01-01T10:10:10.101.101000000Z level=warn msg="Sender failed" component_path=/ component_id=otelcol.processor.batch.issue_test error="telemetry type is not supported"

I interpret this as a problem with the spanmetrics component itself.

Issue

The reason I call it "annoying" is that I do get the metrics.
The pipeline is working fine; it is just emitting erroneous warnings.

Am I doing something wrong with the connector? Is hooking it up directly to the receiver the only "supported" way?

Steps to reproduce

Craft an Alloy pipeline in which an otelcol.connector.spanmetrics connector is sourced from a component other than otelcol.receiver.otlp.

e.g. otelcol.processor.batch or otelcol.processor.tail_sampling
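
For concreteness, here is a minimal sketch of such a pipeline (untested; the "repro" labels are just placeholders, and the exporter endpoint comes from an environment variable as in the example above):

otelcol.receiver.otlp "repro" {
  grpc {}

  output {
    traces = [otelcol.processor.batch.repro.input]
  }
}

otelcol.processor.batch "repro" {
  // Any non-receiver component in front of the connector should trigger the warning.
  output {
    traces = [otelcol.connector.spanmetrics.repro.input]
  }
}

otelcol.connector.spanmetrics "repro" {
  histogram {
    explicit {
      buckets = ["100ms", "1s", "10s"]
    }
  }

  output {
    metrics = [otelcol.exporter.otlp.repro.input]
  }
}

otelcol.exporter.otlp "repro" {
  client {
    endpoint = sys.env("OTLP_SERVER_ENDPOINT")
  }
}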

System information

Linux (amd64) [Container]

Software version

v1.5.1

Configuration


Logs

ts=2025-01-01T10:10:10.101.101000000Z level=warn msg="Sender failed" component_path=/ component_id=otelcol.processor.tail_sampling.default error="telemetry type is not supported"
ts=2025-01-01T10:10:10.101.101000000Z level=warn msg="Sender failed" component_path=/ component_id=otelcol.processor.batch.issue_test error="telemetry type is not supported"
WesselAtWork added the bug label on Jan 3, 2025