adding v0
l0k0ms committed Oct 11, 2018
1 parent 59777f9 commit 7eb419b
Showing 65 changed files with 16,578 additions and 2 deletions.
8 changes: 8 additions & 0 deletions Dockerfile
@@ -0,0 +1,8 @@
FROM python:3.7.0-alpine3.8
LABEL maintainer="Datadog Inc. <[email protected]>"

# the following packages are needed to build psycopg2
RUN apk update && apk add postgresql-dev gcc python3-dev musl-dev
COPY requirements.txt /app/requirements.txt
WORKDIR /app
RUN pip install -r requirements.txt
48 changes: 46 additions & 2 deletions README.md
@@ -1,2 +1,46 @@
# log-workshop-2
2nd log workshop
# Distributed Tracing with APM Workshop

This is a follow-up repo to my [2018 Dash APM Workshop](https://github.com/burningion/dash-apm-workshop), incorporating feedback from the event.

Specifically, this will add:

* Starting with Automatic Instrumentation, and a more complex example program
* More live traffic, to see Trace Search (announced at DASH)
* Debugging a bigger, more complex system, to showcase a more real-world use case
* More Datadog UI usage
* More relevant examples and names for traces
* More realistic errors
* ... and it should also work on Windows.

Until otherwise noted, this repository is a work in progress.

If you've stumbled upon it and have feedback, or have something you'd like to see, feel free to create an issue.

# Running the Application

![Water Sensor App](https://github.com/burningion/distributed-tracing-with-apm-workshop/raw/master/images/dashboard.png)

You'll need a Datadog account with APM and logging enabled; a free trial is enough to play with.

```bash
$ POSTGRES_USER=postgres POSTGRES_PASSWORD=<pg password> DD_API_KEY=<api key> docker-compose up
```

You can open the web app at `http://localhost:5000`, create some pumps, and look at the distributed traces in your Datadog account.
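
If you'd rather script that interaction, the same endpoints are exposed over HTTP by the frontend API added in this commit (`frontend/api.py`, below). A minimal sketch, assuming the stack is up locally:

```python
import requests

BASE = 'http://localhost:5000'

requests.post(f'{BASE}/add_pump')         # register a new pump with the pumps service
requests.get(f'{BASE}/simulate_sensors')  # trigger a round of simulated sensor readings
print(requests.get(f'{BASE}/status').json())  # aggregated status across the services
```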

The frontend of the app is a React app using [Material UI](https://material-ui.com/). It lives in the `single-page-frontend` folder. You can start it up for development with:

```bash
$ npm install
$ npm start
```

It should connect to the running frontend API container, allowing for easier development. When you're finished making changes, run `npm run build`, and then copy the JavaScript from the `build` subdirectory into the Flask frontend app.

# Ideas for Live Debugging via Tracing

These are some ideas that have yet to be implemented (a sketch of the latency idea follows the list):

* A bad deploy that triggers a problem, breaking parts of the API
* Introducing latency in a service in the middle of the request lifecycle
* Introducing a traffic spike / poison payload downstream
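
For the latency idea, one possible shape, purely hypothetical and not implemented here: an environment-variable-controlled delay in one of the Flask services, so the slowdown shows up mid-request-lifecycle in a trace flame graph:

```python
import os
import random
import time

from flask import Flask, jsonify

app = Flask(__name__)

@app.route('/sensors')
def sensors():
    # INJECT_LATENCY_MS is a made-up knob for illustration, not part of this repo
    delay_ms = int(os.environ.get('INJECT_LATENCY_MS', '0'))
    if delay_ms:
        time.sleep(random.uniform(0, delay_ms) / 1000.0)
    return jsonify({'sensor_count': 0, 'sensors': []})
```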
130 changes: 130 additions & 0 deletions docker-compose.yml
@@ -0,0 +1,130 @@
version: '3'
services:
  agent:
    image: "datadog/agent:latest"
    container_name: "datadog-agent"
    environment:
      - DD_API_KEY
      - DD_APM_ENABLED=true
      - DD_HOSTNAME=datadog-workshop
      # Add DD_APM_ANALYZED_SPANS to the Agent container environment.
      # Compatible with version 12.6.5250 or above.
      - DD_APM_ANALYZED_SPANS=users-api|express.request=1,sensors-api|flask.request=1,pumps-service|flask.request=1,iot-frontend|flask.request=1
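      # Format: service|operation=rate, where rate is between 0 and 1 (1 keeps all matching spans for Trace Search)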
      - DD_TAGS=env:workshop
      - DD_PROCESS_AGENT_ENABLED=true
      - DD_LOGS_ENABLED=true
      - DD_LOGS_CONFIG_CONTAINER_COLLECT_ALL=true

    ports:
      - "8126:8126"
    volumes:
      - /var/run/docker.sock:/var/run/docker.sock:ro
      - /proc/:/host/proc/:ro
      - /sys/fs/cgroup/:/host/sys/fs/cgroup:ro
      - /etc/passwd:/etc/passwd:ro
    labels:
      com.datadoghq.ad.logs: '[{"source": "docker", "service": "agent"}]'

  frontend:
    container_name: "frontend"
    environment:
      - FLASK_APP=api.py
      - FLASK_DEBUG=1
    build: .
    command: flask run --port=5000 --host=0.0.0.0
    ports:
      - "5000:5000"
    volumes:
      - "./frontend:/app"
    depends_on:
      - db
    labels:
      com.datadoghq.ad.logs: '[{"source": "iot-frontend", "service": "iot-frontend"}]'

  noder:
    container_name: "noder"
    build: ./node-api
    command: nodemon server.js
    ports:
      - "5004:5004"
    volumes:
      - "./node-api:/app"
      - /app/node_modules
    depends_on:
      - frontend
      - redis
    environment:
      - DD_ENV=workshop
    labels:
      com.datadoghq.ad.logs: '[{"source": "noder", "service": "noder"}]'

  pumps:
    container_name: "pumps"
    environment:
      - FLASK_APP=thing.py
      - FLASK_DEBUG=1
      - POSTGRES_PASSWORD
      - POSTGRES_USER
    build: ./iot-devices
    command: flask run --port=5001 --host=0.0.0.0
    ports:
      - "5001:5001"
    volumes:
      - "./iot-devices:/app"
    depends_on:
      - frontend
      - db
    labels:
      com.datadoghq.ad.logs: '[{"source": "pumps-service", "service": "pumps-service"}]'

  redis:
    container_name: "redis"
    image: redis:latest
    depends_on:
      - agent
    labels:
      com.datadoghq.ad.logs: '[{"source": "redis", "service": "redis"}]'

  sensors:
    container_name: "sensors"
    environment:
      - FLASK_APP=sensors.py
      - FLASK_DEBUG=1
      - POSTGRES_PASSWORD
      - POSTGRES_USER
    build: ./sensors
    command: flask run --port=5002 --host=0.0.0.0
    ports:
      - "5002:5002"
    volumes:
      - "./sensors:/app"
    depends_on:
      - frontend
      - db
    labels:
      com.datadoghq.ad.logs: '[{"source": "sensors", "service": "sensors-api"}]'

  db:
    container_name: "postgres"
    image: postgres:11-alpine
    restart: always
    environment:
      - POSTGRES_PASSWORD
      - POSTGRES_USER
    ports:
      - 5432:5432
    depends_on:
      - agent
    labels:
      com.datadoghq.ad.logs: '[{"source": "postgres", "service": "postgres"}]'

  adminer:
    container_name: "adminer"
    image: adminer
    restart: always
    ports:
      - 8080:8080
    depends_on:
      - agent
    labels:
      com.datadoghq.ad.logs: '[{"source": "adminer", "service": "adminer"}]'
Binary file added frontend/__pycache__/api.cpython-37.pyc
92 changes: 92 additions & 0 deletions frontend/api.py
@@ -0,0 +1,92 @@
import requests

from flask import Flask, Response, jsonify, render_template
from flask import request as flask_request

from flask_cors import CORS
import os

from ddtrace import tracer, patch, config
from ddtrace.contrib.flask import TraceMiddleware
import logging
import subprocess

# Tracer configuration
tracer.configure(hostname='agent')
tracer.set_tags({'env': 'workshop'})
patch(requests=True)

# enable distributed tracing for requests
# to send headers (globally)
config.requests['distributed_tracing'] = True
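# (with this enabled, ddtrace injects propagation headers such as
# x-datadog-trace-id and x-datadog-parent-id into outgoing requests,
# so downstream services join the same trace)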

app = Flask('api')

if os.environ.get('FLASK_DEBUG'):
    CORS(app)

traced_app = TraceMiddleware(app, tracer, service='iot-frontend')

@app.route('/')
def homepage():
    return app.send_static_file('index.html')

@app.route('/service-worker.js')
def service_worker_js():
    return app.send_static_file('js/service-worker.js')

@app.route('/status')
def system_status():
    status = requests.get('http://sensors:5002/sensors').json()
    app.logger.info(f"Sensor status: {status}")
    pumps = requests.get('http://pumps:5001/devices').json()
    users = requests.get('http://noder:5004/users').json()
    return jsonify({'sensor_status': status, 'pump_status': pumps, 'users': users})

@app.route('/users', methods=['GET', 'POST'])
def users():
    if flask_request.method == 'POST':
        newUser = flask_request.get_json()
        userStatus = requests.post('http://noder:5004/users', json=newUser).json()
        return jsonify(userStatus)
    elif flask_request.method == 'GET':
        users = requests.get('http://noder:5004/users').json()
        return jsonify(users)

@app.route('/add_sensor')
def add_sensor():
    sensors = requests.post('http://sensors:5002/sensors').json()
    app.logger.info(f"Adding {sensors}")
    return jsonify(sensors)

@app.route('/add_pump', methods=['POST'])
def add_pump():
    pumps = requests.post('http://pumps:5001/devices').json()
    app.logger.info(f"Adding {pumps} to the pumps pool")
    return jsonify(pumps)

@app.route('/generate_requests', methods=['POST'])
def call_generate_requests():
    payload = flask_request.get_json()
    span = tracer.current_span()
    app.logger.info(f"Looking at {span}")
    app.logger.info(f"with span id {span.span_id}")

    # tag the span with the requested load so it's searchable in Datadog
    span.set_tags({'requests': payload['total'], 'concurrent': payload['concurrent']})

    output = subprocess.check_output(['/app/traffic_generator.py',
                                      str(payload['concurrent']),
                                      str(payload['total']),
                                      str(payload['url'])])
    app.logger.info(f"Result for subprocess call: {output}")
    return jsonify({'traffic': str(payload['concurrent']) + ' concurrent requests generated, ' +
                    str(payload['total']) + ' requests total.',
                    'url': payload['url']})

@app.route('/simulate_sensors')
def simulate_sensors():
    sensors = requests.get('http://sensors:5002/refresh_sensors').json()
    app.logger.info(f"Simulating sensors {sensors}")
    return jsonify(sensors)
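
For reference, the `/generate_requests` route above expects a JSON body with `concurrent`, `total`, and `url` keys. A minimal sketch of driving it, assuming the stack is running locally (values illustrative):

```python
import requests

payload = {'concurrent': 10, 'total': 100, 'url': 'http://frontend:5000/status'}
print(requests.post('http://localhost:5000/generate_requests', json=payload).json())
```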
1 change: 1 addition & 0 deletions frontend/static/index.html
@@ -0,0 +1 @@
<!doctype html><html lang="en" dir="ltr"><head><meta charset="utf-8"><meta name="viewport" content="minimum-scale=1,initial-scale=1,width=device-width,shrink-to-fit=no"><meta name="theme-color" content="#000000"><title>Datadog APM Water Management Example</title><link rel="stylesheet" href="https://fonts.googleapis.com/css?family=Roboto:300,400,500"/></head><body><div id="root"></div><script type="text/javascript" src="/static/js/main.df577d7c.js"></script></body></html>
2 changes: 2 additions & 0 deletions frontend/static/js/main.df577d7c.js


1 change: 1 addition & 0 deletions frontend/static/js/main.df577d7c.js.map


1 change: 1 addition & 0 deletions frontend/static/js/service-worker.js


14 changes: 14 additions & 0 deletions frontend/templates/index.html
@@ -0,0 +1,14 @@
<html>
  <head>
    <title>Hello from Flask</title>
  </head>
  <body>
    {% if sensors['sensor_count'] != 0 %}
      <h1>Sensors</h1>
      {{ sensors }}
    {% else %}
      <h1>Create a Sensor</h1>
      <button>Add Sensor</button>
    {% endif %}
  </body>
</html>
40 changes: 40 additions & 0 deletions frontend/traffic_generator.py
@@ -0,0 +1,40 @@
#!/usr/local/bin/python
from ddtrace import tracer, patch, config, Pin
tracer.configure(hostname='agent')
patch(requests=True, futures=True, asyncio=True)

tracer.set_tags({'env': 'workshop'})
tracer.debug_logging = True

import asyncio
import argparse
from requests_threads import AsyncSession
# enable distributed tracing for requests
# to send headers (globally)

import logging
logger = logging.getLogger()
config.requests['distributed_tracing'] = True

parser = argparse.ArgumentParser(description='Concurrent Traffic Generator')
parser.add_argument('concurrent', type=int, help='Number of Concurrent Requests')
parser.add_argument('total', type=int, help='Total number of Requests to Make')
parser.add_argument('url', type=str, help='URL to fetch')
args = parser.parse_args()

asyncio.set_event_loop(asyncio.new_event_loop())

session = AsyncSession(n=args.concurrent)
Pin.override(session, service='concurrent-requests-generator')

async def generate_requests():
    with tracer.trace('flask.request', service='concurrent-requests-generator') as span:
        rs = []
        for _ in range(args.total):
            rs.append(session.get(args.url))
        for i in range(args.total):
            rs[i] = await rs[i]
        print(rs)

session.run(generate_requests)
session.close()
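
For reference, `frontend/api.py` above invokes this script as a subprocess. A minimal sketch of the same call, with illustrative values:

```python
import subprocess

# arguments are: concurrent, total, url (see the argparse definitions above)
output = subprocess.check_output(['/app/traffic_generator.py', '10', '100',
                                  'http://frontend:5000/status'])
print(output)
```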
Binary file added images/.DS_Store
Binary file added images/dashboard.png
Binary file added images/dd_logo.png
Binary file added images/logs_workshop/agent_filtered_out.png
Binary file added images/logs_workshop/empty_log_explorer.png
Binary file added images/logs_workshop/index_filter_agent_log.png
Binary file added images/logs_workshop/installed_integrations.png
Binary file added images/logs_workshop/live_tail_agent.png
Binary file added images/logs_workshop/log_flow.png
Binary file added images/logs_workshop/log_flow_with_service.png
Binary file added images/logs_workshop/metrics_switch_to_logs.png
Binary file added images/logs_workshop/parsed_redis_log.png
Binary file added images/logs_workshop/pipeline_page.png
Binary file added images/logs_workshop/removing_debug_logs.png
Binary file added images/logs_workshop/sensors_api_switch.png
Binary file added images/workshop-architecture.png
8 changes: 8 additions & 0 deletions iot-devices/Dockerfile
@@ -0,0 +1,8 @@
FROM arm64v8/python:3.7.0-alpine3.8

RUN apk add build-base eudev-dev openzwave openzwave-dev cython postgresql-dev
COPY requirements.txt /app/requirements.txt
WORKDIR /app
RUN pip install -r requirements.txt
# uncomment below to build zwave for ARM, also add Cython back into requirements
# RUN pip install python_openzwave --no-deps --no-cache-dir --install-option="--flavor=shared"
Binary file added iot-devices/__pycache__/bootstrap.cpython-37.pyc
Binary file added iot-devices/__pycache__/models.cpython-37.pyc
Binary file added iot-devices/__pycache__/thing.cpython-37.pyc
39 changes: 39 additions & 0 deletions iot-devices/bootstrap.py
@@ -0,0 +1,39 @@
from flask import Flask
from ddtrace import tracer, patch
patch(sqlalchemy=True, sqlite3=True, psycopg=True)
from models import Pump, db


# configure the tracer so that it reaches the Datadog Agent
# available in another container
tracer.configure(hostname='agent')

import os
DB_USERNAME = os.environ['POSTGRES_USER']
DB_PASSWORD = os.environ['POSTGRES_PASSWORD']


def create_app():
    """Create a Flask application"""
    app = Flask(__name__)
    #app.config['SQLALCHEMY_DATABASE_URI'] = 'sqlite:///app.db'
    app.config['SQLALCHEMY_DATABASE_URI'] = 'postgresql://' + DB_USERNAME + ':' + DB_PASSWORD + '@' + 'db/' + DB_USERNAME
    app.config['SQLALCHEMY_TRACK_MODIFICATIONS'] = False

    db.init_app(app)
    initialize_database(app, db)
    return app


def initialize_database(app, db):
    """Drop and restore database in a consistent state"""
    with app.app_context():
        db.drop_all()
        db.create_all()
        first_pump = Pump('Pump 1', 'OFF', 5.1)
        second_pump = Pump('Pump 2', 'OFF', 3002.1)
        third_pump = Pump('Pump 3', 'ON', 5242.1)
        db.session.add(first_pump)
        db.session.add(second_pump)
        db.session.add(third_pump)
        db.session.commit()