
Submit DBM query samples via new aggregator API #9045

Merged
merged 2 commits into master from djova/postgres-send-samples-through-aggregator
Apr 16, 2021

Conversation

djova
Contributor

@djova djova commented Mar 26, 2021

What does this PR do?

Follow-up to #9165: Update postgres & mysql checks to submit DBM events via the new aggregator API.

Also improves the tests to better handle threads.

Motivation

Submit events using the more robust agent Go code, with proper batching, buffering, retries, error handling, and tracking of internal statistics.

Review checklist (to be filled by reviewers)

  • Feature or bugfix MUST have appropriate tests (unit, integration, e2e)
  • PR title must be written as a CHANGELOG entry (see why)
  • File changes must correspond to the primary purpose of the PR as described in the title (small unrelated changes should have their own PR)
  • PR must have changelog/ and integration/ labels attached

djova added a commit to DataDog/datadog-agent that referenced this pull request Apr 9, 2021
Add a new aggregator API through which checks can submit "event platform events" of various types.

All supported `eventTypes` are hardcoded in `EventPlatformForwarder`.

The `dbm-samples` and `dbm-metrics` events are expected to arrive fully serialized, so their pipelines are simple "HTTP passthrough" pipelines that skip the other features of logs pipelines, such as processing rules and encoding.

Future event types will be able to add more detailed processing if they need it.

**Overall flow**

1. `aggregator.submit_event_platform_event(check_id, rawEvent, "{eventType}")` - Python API. Here's how the postgres check would be updated to use it: DataDog/integrations-core#9045 (see the sketch after this list).
2. `BufferedAggregator` forwards events to the `EventPlatformForwarder`. Events are **dropped** here if the `EventPlatformForwarder` is backed up for any reason.
3. `EventPlatformForwarder` forwards events to the pipeline for the given `eventType`, **dropping** events for unknown `eventTypes`.
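
For illustration, here is a minimal Python sketch of step 1 from a check's point of view. Only the `submit_event_platform_event(check_id, rawEvent, eventType)` call itself comes from this API; the wrapper function, the payload shape, and the assumption that the `aggregator` binding is importable in the check's environment are hypothetical.

```python
# Minimal sketch (assumptions: the `aggregator` binding is available in the
# check's environment; `submit_query_sample` and the payload shape are
# hypothetical and not part of this PR's diff).
import json

try:
    import aggregator  # binding exposed by the agent's embedded interpreter
except ImportError:
    aggregator = None  # e.g. when running outside the agent


def submit_query_sample(check_id, sample):
    """Serialize one query sample and hand it to the event platform pipeline."""
    if aggregator is None:
        return
    # dbm-samples events must arrive fully serialized, since their pipeline is
    # a plain HTTP passthrough with no further processing.
    raw_event = json.dumps(sample)
    aggregator.submit_event_platform_event(check_id, raw_event, "dbm-samples")
```

Because a backed-up forwarder drops events (step 2) rather than blocking, a submission like this never stalls the check run itself.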

**Internal Agent Stats**

*Prometheus*: `aggregator.flush - data_type:{eventType}, state:{ok|error}`

*ExpVar*: `EventPlatformEvents` & `EventPlatformEventsErrors`: counts by `eventType`
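
As a hedged example of reading those ExpVar counters while debugging, the sketch below assumes the agent exposes expvars on the default `expvar_port` (5000) and that the counters sit under the `aggregator` expvar map; both details are assumptions rather than something specified here.

```python
# Hedged sketch: fetch EventPlatformEvents / EventPlatformEventsErrors counts
# from the agent's expvars. The port (5000) and the nesting under "aggregator"
# are assumptions, not confirmed by this PR.
import json
import urllib.request


def event_platform_counts(host="localhost", port=5000):
    with urllib.request.urlopen(f"http://{host}:{port}/debug/vars") as resp:
        expvars = json.load(resp)
    agg = expvars.get("aggregator", {})
    return {
        "EventPlatformEvents": agg.get("EventPlatformEvents", {}),
        "EventPlatformEventsErrors": agg.get("EventPlatformEventsErrors", {}),
    }


if __name__ == "__main__":
    print(event_platform_counts())  # e.g. {'EventPlatformEvents': {'dbm-samples': 176}, ...}
```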

**User-Facing Agent Stats**

Statistics for each `eventType` will be tracked alongside other types of telemetry (`Service Checks`, `Series`, ...). Where appropriate, the raw `eventType` is translated to a human-readable name (e.g. `dbm-samples` --> `Database Monitoring Query Samples`).

`agent status` output:
```
=========
Collector
=========

  Running Checks
  ==============

    postgres (5.4.0)
    ----------------
      Instance ID: postgres:1df52d84fb6f603c [OK]
      Metric Samples: Last Run: 366, Total: 7,527
      Database Monitoring Query Samples: Last Run: 11, Total: 176
      ...

==========
Aggregator
==========
  Checks Metric Sample: 29,818
  Database Monitoring Query Samples: 473
  ...
```

`agent check {check_name}` output:

```
=== Metrics ===
...
=== Database Monitoring Query Samples ===
...
```

`agent check {check_name} --json` output will use the raw event types instead of the human-readable names:

```
"aggregator": {
  "metrics": [...],
  "dbm-samples": [...],
  ...
}
```
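
For example, the raw event types can be tallied from that JSON with a short script. This is only a sketch: it assumes the output is shaped like the excerpt above, and the exact top-level wrapping of `agent check --json` output may differ by agent version.

```python
# Sketch: count event platform events per raw eventType in `agent check --json` output.
# Usage (hypothetical): agent check postgres --json | python count_dbm_events.py
import json
import sys


def count_event_platform_events(doc):
    """Return {event_type: count} for list-valued aggregator entries other than metrics."""
    counts = {}
    items = doc if isinstance(doc, list) else [doc]  # output may be a list of per-check results
    for item in items:
        for key, values in item.get("aggregator", {}).items():
            if key == "metrics" or not isinstance(values, list):
                continue  # keep only the raw event type buckets such as "dbm-samples"
            counts[key] = counts.get(key, 0) + len(values)
    return counts


if __name__ == "__main__":
    print(count_event_platform_events(json.load(sys.stdin)))
```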

**Motivation**

As of DataDog/integrations-core#8627 and DataDog/integrations-core#8629, the postgres & mysql checks post statement sample payloads to the intake directly from Python. With this change we'll be able to move responsibility for posting payloads to the more robust agent Go code, with proper batching, buffering, retries, error handling, and tracking of statistics.
@djova djova changed the title from "postgres: submit DBM query samples via new aggregator API" to "postgres & mysql: submit DBM query samples via new aggregator API" on Apr 9, 2021
@djova djova force-pushed the djova/postgres-send-samples-through-aggregator branch from e1e1f3f to b60ba71 on April 9, 2021 19:01
@ghost ghost added the integration/mysql label Apr 9, 2021
@djova djova force-pushed the djova/postgres-send-samples-through-aggregator branch 2 times, most recently from 165793d to d2ac40b on April 12, 2021 14:00
Contributor

@ofek ofek left a comment


Nice! Let's modify checks in a separate PR

djova added a commit that referenced this pull request Apr 14, 2021
As of #9045 this client is unused and can be removed.
@djova djova force-pushed the djova/postgres-send-samples-through-aggregator branch from acd080b to 04c9b3a on April 14, 2021 18:45
@djova
Contributor Author

djova commented Apr 14, 2021

> Let's modify checks in a separate PR

Good point. Split this out into two other PRs: #9165, #9166

djova added a commit that referenced this pull request Apr 15, 2021
As of #9045 this client is unused and can be removed.
@djova djova force-pushed the djova/postgres-send-samples-through-aggregator branch from 04c9b3a to 6e00979 on April 15, 2021 19:51
remeh added a commit to DataDog/datadog-agent that referenced this pull request Apr 16, 2021
* add new generic event platform aggregator API


* simplify

* remove debug log

* move json marshaling to check.go

* check enabled before lock

* refactor, add noop ep forwarder

* Update pkg/collector/check/stats.go

Co-authored-by: maxime mouial <[email protected]>

* remove purge during flush

* remove global

* Update rtloader/include/datadog_agent_rtloader.h

Co-authored-by: Rémy Mathieu <[email protected]>

* Update rtloader/common/builtins/aggregator.h

Co-authored-by: Rémy Mathieu <[email protected]>

* Update pkg/collector/check/stats.go

Co-authored-by: Rémy Mathieu <[email protected]>

* remove unnecessary

* rename lock

* refactor pipelines

* remove unnecessary nil check

* revert

* Update releasenotes/notes/event-platform-aggregator-api-33e92539f08ac5c2.yaml

Co-authored-by: Alexandre Yang <[email protected]>

* track processed

* move locking into ep forwarder

* move to top

* Update pkg/aggregator/aggregator.go

Co-authored-by: Alexandre Yang <[email protected]>

* remove read lock

* refactor error logging

* move to pkg/epforwarder

* update default dbm-metrics endpoint

* local var

Co-authored-by: maxime mouial <[email protected]>
Co-authored-by: Rémy Mathieu <[email protected]>
Co-authored-by: Alexandre Yang <[email protected]>
@djova djova force-pushed the djova/postgres-send-samples-through-aggregator branch from 6e00979 to f1ee623 on April 16, 2021 14:28
@djova djova marked this pull request as ready for review April 16, 2021 14:29
@djova djova requested a review from a team as a code owner April 16, 2021 14:29
@djova djova force-pushed the djova/postgres-send-samples-through-aggregator branch 2 times, most recently from 8665e37 to b7e4fdb on April 16, 2021 14:40
Follow-up to #9165: Update postgres & mysql checks to submit DBM events via the new aggregator API.

Also improves the tests to better handle threads.

Motivation: Submit events using the more robust agent Go code, with proper batching, buffering, retries, error handling, and tracking of internal statistics.
@djova djova force-pushed the djova/postgres-send-samples-through-aggregator branch from b7e4fdb to 888f71d on April 16, 2021 14:42
@ofek ofek changed the title from "postgres & mysql: submit DBM query samples via new aggregator API" to "Submit DBM query samples via new aggregator API" on Apr 16, 2021
@ofek ofek merged commit b58de78 into master Apr 16, 2021
@ofek ofek deleted the djova/postgres-send-samples-through-aggregator branch April 16, 2021 19:27
djova added a commit that referenced this pull request Apr 19, 2021
As of #9045 this client is unused and can be removed.
djova added a commit that referenced this pull request Apr 28, 2021
As of #9045 this client is unused and can be removed.
djova added a commit that referenced this pull request May 7, 2021
As of #9045 this client is unused and can be removed.
djova added a commit that referenced this pull request May 24, 2021
As of #9045 this client is unused and can be removed.
ofek pushed a commit that referenced this pull request May 25, 2021
As of #9045 this client is unused and can be removed.