
Chore/fly deploy #30

Merged (9 commits, Dec 22, 2024)
175 changes: 175 additions & 0 deletions .dockerignore
@@ -0,0 +1,175 @@
# flyctl launch added from .gitignore
# Created by https://www.toptal.com/developers/gitignore/api/flask,python,visualstudiocode
# Edit at https://www.toptal.com/developers/gitignore?templates=flask,python,visualstudiocode

### Deployment Providers ###
**/.vercel

### Flask ###
**/instance/*
!**/instance/.gitignore
**/.webassets-cache
**/.env

### Flask.Python Stack ###
# Byte-compiled / optimized / DLL files
**/__pycache__
**/*.py[cod]
**/*$py.class

# Distribution / packaging
**/.Python
**/build
**/develop-eggs
**/dist
**/downloads
**/eggs
**/.eggs
**/lib
**/lib64
**/parts
**/sdist
**/var
**/wheels
**/share/python-wheels
**/*.egg-info
**/.installed.cfg
**/*.egg
**/MANIFEST
**/*.lock
# allow poetry.lock
!**/poetry.lock

# PyInstaller
# Usually these files are written by a python script from a template
# before PyInstaller builds the exe, so as to inject date/other infos into it.
**/*.manifest
**/*.spec

# Installer logs
**/pip-log.txt
**/pip-delete-this-directory.txt

# Unit test / coverage reports
**/htmlcov
**/.tox
**/.nox
**/.coverage
**/.coverage.*
**/.cache
**/nosetests.xml
**/coverage.xml
**/*.cover
**/*.py,cover
**/.hypothesis
**/.pytest_cache
**/cover

# Translations
**/*.mo
**/*.pot

# Django stuff:
**/*.log
**/local_settings.py
**/db.sqlite3
**/db.sqlite3-journal

# Flask stuff:
**/instance

# Scrapy stuff:
**/.scrapy

# Sphinx documentation
**/docs/_build

# PyBuilder
**/.pybuilder
**/target

# Jupyter Notebook
**/.ipynb_checkpoints

# IPython
**/profile_default
**/ipython_config.py

# PEP 582; used by e.g. github.com/David-OConnor/pyflow and github.com/pdm-project/pdm
**/__pypackages__

# Celery stuff
**/celerybeat-schedule
**/celerybeat.pid

# SageMath parsed files
**/*.sage.py

# Environments
**/.venv
**/.env
**/venv
**/.direnv
**/.envrc

# Spyder project settings
**/.spyderproject
**/.spyproject

# Rope project settings
**/.ropeproject

# mkdocs documentation
site

# mypy
**/.mypy_cache
**/.dmypy.json
**/dmypy.json

# Pyre type checker
**/.pyre

# pytype static type analyzer
**/.pytype

# Cython debug symbols
**/cython_debug

### VisualStudioCode ###
**/.vscode/*
!**/.vscode/settings.json
!**/.vscode/tasks.json
!**/.vscode/launch.json
!**/.vscode/extensions.json
!**/.vscode/*.code-snippets

# Local History for Visual Studio Code
**/.history

# Built Visual Studio Code Extensions
**/*.vsix

### VisualStudioCode Patch ###
# Ignore all local history of files
**/.history
**/.ionide

# Support for Project snippet scope
**/.vscode/*.code-snippets

# Ignore code-workspaces
**/*.code-workspace

# Snyk vuln scanner
**/.dccache

# Config file
**/config.ini

# End of https://www.toptal.com/developers/gitignore/api/flask,python,visualstudiocode

# flyctl launch added from .mypy_cache/.gitignore
# Automatically created by mypy
.mypy_cache/**/*
fly.toml
18 changes: 18 additions & 0 deletions .github/workflows/fly-deploy.yml
@@ -0,0 +1,18 @@
# See https://fly.io/docs/app-guides/continuous-deployment-with-github-actions/

name: Fly Deploy
on:
  push:
    branches:
      - main
jobs:
  deploy:
    name: Deploy app
    runs-on: ubuntu-latest
    concurrency: deploy-group  # optional: ensure only one action runs at a time
    steps:
      - uses: actions/checkout@v4
      - uses: superfly/flyctl-actions/setup-flyctl@master
      - run: flyctl deploy --remote-only
        env:
          FLY_API_TOKEN: ${{ secrets.FLY_API_TOKEN }}
Comment on lines +10 to +18

Check warning (Code scanning / CodeQL, Medium): Workflow does not contain permissions

Actions Job or Workflow does not set permissions.
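
One way to clear this warning is to give the workflow an explicit least-privilege permissions block. The sketch below shows how that could look for this deploy job; the choice of `contents: read` is an assumption (the deploy step authenticates to Fly with FLY_API_TOKEN, so the default GITHUB_TOKEN only needs to read the repository), and the exact scopes needed may differ:

# Sketch: the same workflow with an explicit least-privilege permissions block.
# `contents: read` is an assumption; the deploy step authenticates with
# FLY_API_TOKEN, so GITHUB_TOKEN only needs read access to check out the code.
name: Fly Deploy
on:
  push:
    branches:
      - main
permissions:
  contents: read
jobs:
  deploy:
    name: Deploy app
    runs-on: ubuntu-latest
    concurrency: deploy-group
    steps:
      - uses: actions/checkout@v4
      - uses: superfly/flyctl-actions/setup-flyctl@master
      - run: flyctl deploy --remote-only
        env:
          FLY_API_TOKEN: ${{ secrets.FLY_API_TOKEN }}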
89 changes: 69 additions & 20 deletions bm_monitor.py
@@ -64,6 +64,17 @@

# Handle graceful shutdown on Ctrl+C or SIGTERM
def signal_handler(sig, frame):
"""Handles system signal interrupts for graceful application shutdown.

This function is designed to intercept system signals and perform a clean disconnection from the socket connection before terminating the application. It ensures that resources are properly released and the application exits smoothly.

Args:
sig (int): The signal number received.
frame (frame): The current stack frame.

Returns:
None
"""

issue (bug_risk): Error handling has been removed from the push_pushover function. Network operations should always include error handling to prevent silent failures.

Consider restoring the try-except block to catch and log potential network errors, ensuring reliable notification delivery tracking.

logging.info("Shutting down gracefully...")
sio.disconnect()
sys.exit(0)
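
For context, a handler like this only takes effect once it is registered for the relevant signals. The registration in this repo is elided from the diff, so the wiring below is a minimal sketch under that assumption, not the actual lines in bm_monitor.py:

import signal

# Sketch (assumed wiring): route Ctrl+C and SIGTERM to the graceful handler above.
signal.signal(signal.SIGINT, signal_handler)   # Ctrl+C
signal.signal(signal.SIGTERM, signal_handler)  # e.g. a Fly Machine stop or redeploy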
@@ -73,18 +84,26 @@ def signal_handler(sig, frame):

# Send push notification via Pushover. Disabled if not configured in config.py
def push_pushover(msg):

issue (complexity): Consider restoring the error handling and correcting the docstring in the push_pushover and push_dapnet functions.

The removal of error handling makes the code more fragile and inconsistent with the rest of the codebase. Additionally, the push_pushover() docstring is incorrect. Suggested changes:

def push_pushover(msg):
    """Sends a push notification via the Pushover service.

    Args:
        msg (str): The message content to be sent.

    Returns:
        None
    """
    try:
        conn = http.client.HTTPSConnection("api.pushover.net:443")
        conn.request("POST", "/1/messages.json",
            urllib.parse.urlencode({
            "token": cfg.pushover_token,
            "user": cfg.pushover_user,
            "message": msg,
            }), { "Content-type": "application/x-www-form-urlencoded" })
        conn.getresponse()
        logging.info("Pushover notification sent.")
    except Exception as e:
        logging.error(f"Failed to send Pushover notification: {e}")

def push_dapnet(msg):
    """Sends a pager notification via the DAPNET service.

    Args:
        msg (str): The message content to be sent.

    Returns:
        None
    """
    try:
        dapnet_json = json.dumps({
            "text": msg,
            "callSignNames": cfg.dapnet_callsigns,
            "transmitterGroupNames": [cfg.dapnet_txgroup],
            "emergency": True
        })
        response = requests.post(
            cfg.dapnet_url,
            data=dapnet_json,
            auth=HTTPBasicAuth(cfg.dapnet_user, cfg.dapnet_pass)
        )
        logging.info("DAPNET notification sent.")
    except Exception as e:
        logging.error(f"Failed to send DAPNET notification: {e}")

These changes:

  1. Restore error handling to maintain consistency with push_discord()
  2. Fix the incorrect docstring in push_pushover()
  3. Add success logging to match error handling
  4. Improve code formatting in push_dapnet()

try:
conn = http.client.HTTPSConnection("api.pushover.net:443")
conn.request("POST", "/1/messages.json",
urllib.parse.urlencode({
"token": cfg.pushover_token,
"user": cfg.pushover_user,
"message": msg,
}), { "Content-type": "application/x-www-form-urlencoded" })
conn.getresponse()
logging.info("Pushover notification sent.")
except Exception as e:
logging.error(f"Failed to send Pushover notification: {e}")
"""Sends a notification to a Discord channel or thread via webhook.

issue (typo): The docstring for push_pushover incorrectly describes Discord webhook functionality instead of Pushover notifications.

Suggested implementation:

    """Sends a notification via Pushover service.

    This function sends a message through the Pushover notification service using their HTTP API.
    It establishes a secure connection to api.pushover.net to deliver the notification.

    Args:
        msg (str): The message content to be sent.
        token (str): The Pushover application token.
        user (str): The user/group key identifying the recipient(s).

    Returns:
        None
    """

Note: I've assumed the most common Pushover API parameters (token and user) based on the standard Pushover API usage. If the function signature is different, you may need to adjust the Args section of the docstring to match the actual parameters used in your implementation.


This function sends a message to a specified Discord webhook URL, with optional support for posting to a specific thread. It utilizes the discord-webhook library to execute the webhook request.

Args:
wh_url (str): The Discord webhook URL.
msg (str): The message content to be sent.
thread_id (str, optional): The thread ID for posting to a specific thread. Defaults to None.

Returns:
None
"""
conn = http.client.HTTPSConnection("api.pushover.net:443")
conn.request("POST", "/1/messages.json",
urllib.parse.urlencode({
"token": cfg.pushover_token,
"user": cfg.pushover_user,
"message": msg,
}), { "Content-type": "application/x-www-form-urlencoded" })
conn.getresponse()

# Send notification to Discord Channel or Thread via webhook
def push_discord(wh_url, msg, thread_id=None):

issue (bug_risk): Error handling has been removed from the push_dapnet function. Network requests should include error handling.

Restore the try-except block to properly handle and log potential network failures when sending DAPNET notifications.

@@ -105,15 +124,38 @@ def push_discord(wh_url, msg, thread_id=None):

# Send pager notification via DAPNET. Disabled if not configured in config.py
def push_dapnet(msg):
try:
dapnet_json = json.dumps({"text": msg, "callSignNames": cfg.dapnet_callsigns, "transmitterGroupNames": [cfg.dapnet_txgroup], "emergency": True})
response = requests.post(cfg.dapnet_url, data=dapnet_json, auth=HTTPBasicAuth(cfg.dapnet_user,cfg.dapnet_pass))
logging.info("DAPNET notification sent.")
except Exception as e:
logging.error(f"Failed to send DAPNET notification: {e}")
"""Sends a pager notification via the DAPNET (Digital Amateur Paging Network) service.

This function sends an emergency text message to specified DAPNET callsigns and transmitter groups using the configured DAPNET credentials. It prepares a JSON payload and submits a POST request to the DAPNET API.

Args:
msg (str): The message content to be sent via DAPNET.

Returns:
None
"""
dapnet_json = json.dumps({"text": msg, "callSignNames": cfg.dapnet_callsigns, "transmitterGroupNames": [cfg.dapnet_txgroup], "emergency": True})
response = requests.post(cfg.dapnet_url, data=dapnet_json, auth=HTTPBasicAuth(cfg.dapnet_user,cfg.dapnet_pass))

# Construct the message to be sent
def construct_message(c):
"""Constructs a formatted message string for a transmission event.

This function generates a human-readable description of a radio transmission, including details about the source, destination, time, and duration. It provides a comprehensive text representation of a communication event.

Args:
c (dict): A dictionary containing transmission details with keys including:
- DestinationID
- Stop
- Start
- TalkerAlias
- SourceCall
- SourceName
- DestinationName

Returns:
str: A formatted message string describing the transmission event.
"""
tg = c["DestinationID"]
out = ""
duration = c["Stop"] - c["Start"]
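
The rest of the function body is collapsed in this view. Purely as an illustration of the kind of string the docstring describes, and not the actual formatting used in this PR, assembly might look something like:

# Hypothetical sketch only; the real formatting in construct_message is
# collapsed in this diff view and will differ.
from datetime import datetime, timezone

def construct_message_sketch(c):
    duration = c["Stop"] - c["Start"]
    when = datetime.fromtimestamp(c["Start"], tz=timezone.utc).strftime("%H:%M:%S UTC")
    return (
        f'{c["SourceCall"]} ({c["SourceName"]}) was active on '
        f'TG {c["DestinationID"]} ({c["DestinationName"]}) at {when} '
        f'for {duration} seconds'
    )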
@@ -141,9 +183,16 @@ def connect():

@sio.on("mqtt")
def on_mqtt(data):

issue (code-quality): Low code quality found in on_mqtt - 7% (low-code-quality)


Explanation: The quality score for this function is below the quality threshold of 25%.
This score is a combination of the method length, cognitive complexity and working memory.

How can you solve this?

It might be worth refactoring this function to make it shorter and more readable.

  • Reduce the function length by extracting pieces of functionality out into
    their own functions. This is the most important thing you can do - ideally a
    function should be less than 10 lines.
  • Reduce nesting, perhaps by introducing guard clauses to return early.
  • Ensure that variables are tightly scoped, so that code using related concepts
    sits together within the function rather than being scattered.
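
As a concrete illustration of those suggestions, the sketch below shows one possible shape for the function using guard clauses and a small extracted helper. The helper name, the config fields beyond cfg.talkgroups, and the event filtering are assumptions for the sketch, not the code in this PR:

# Sketch only: a guard-clause structure for on_mqtt (names are hypothetical).
def on_mqtt(data):
    call = json.loads(data["payload"])

    # Guard clauses: return early on events we do not care about.
    if call.get("Event") != "Session-Stop":          # assumed event filter
        return
    if call["DestinationID"] not in cfg.talkgroups and call["SourceCall"] not in cfg.callsigns:
        return
    if call["Stop"] - call["Start"] < cfg.min_duration:  # assumed config field
        return

    notify(construct_message(call))

def notify(msg):
    # Hypothetical helper: fan the message out to whichever channels are configured.
    if cfg.pushover:
        push_pushover(msg)
    if cfg.dapnet:
        push_dapnet(msg)
    if cfg.discord:
        push_discord(cfg.discord_wh_url, msg)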

if cfg.verbose and isinstance(data['payload'], dict) and data['payload'].get('DestinationID') in cfg.talkgroups:
logging.debug(f"Filtered MQTT event: Event={data['payload'].get('Event', 'Unknown')} DestinationID={data['payload'].get('DestinationID')} SourceCall={data['payload'].get('SourceCall')}")
"""Processes MQTT messages from the Brandmeister network and manages notification logic.

This function handles incoming MQTT messages by evaluating communication events, tracking activity across talkgroups and callsigns, and triggering notifications based on configured criteria. It determines whether a communication event meets the notification requirements and dispatches messages through configured notification channels.

Args:
data (dict): A dictionary containing the MQTT payload with communication event details.

Returns:
None
"""
call = json.loads(data['payload'])

tg = call["DestinationID"]
20 changes: 20 additions & 0 deletions fly.toml
@@ -0,0 +1,20 @@
# fly.toml app configuration file generated for bm-monitor on 2024-12-21T13:56:20-06:00
#
# See https://fly.io/docs/reference/configuration/ for information about how to use this file.
#

app = 'bm-monitor'
primary_region = 'dfw'

[build]

[http_service]
internal_port = 8080
force_https = true
auto_stop_machines = 'stop'
auto_start_machines = true
min_machines_running = 0
processes = ['app']

[[vm]]
size = 'shared-cpu-1x'