This is the back-end of the SnapperGPS web application. It fetches signal snapshots from a PostgreSQL database and writes location estimates back to it. It is essentially an extended version of the snapshot-gnss-algorithms repository.
The code in this repository is split into two parts:
First, the directory `core` is largely identical to the snapshot-gnss-algorithms repository and contains the core snapshot GNSS algorithms presented in:
Jonas Beuchert and Alex Rogers. 2021. SnapperGPS: Algorithms for Energy-Efficient Low-Cost Location Estimation Using GNSS Signal Snapshots. In SenSys ’21: ACM Conference on Embedded Networked Sensor Systems, November, 2021, Coimbra, Portugal. ACM, New York, NY, USA, 13 pages. https://doi.org/10.1145/3485730.3485931.
For details on the algorithms, please refer to the respective repository and this open-access publication.
Second, the directory `web_app/processing` contains an additional layer of code to interface with these algorithms. There are two top-level Python scripts. The first script, `maintain_navigation_data.py`, downloads satellite navigation data to the directory `web_app/processing/navigation_data`. It updates the local navigation data every 15 min using a server of the BKG or of NASA as source. Note that the navigation data is pre-processed to reduce the file size and accelerate reading, and is then stored in NumPy's `.npy` format, separately as a 2D array for every day and satellite system (GPS - G, Galileo - E, BeiDou - C). The files are named `year_day_gnss.npy`. Processing data from a certain day requires satellite navigation data for this day to be available. The second important Python script is `process_queue.py`, which handles the location estimation and calls functions from the directory mentioned first. While it is usually sufficient to run a single instance of the navigation data script, you can run multiple instances of the processing script to parallelise the processing of datasets and, thus, accelerate the processing if the server has sufficient compute resources. Each instance checks the PostgreSQL database every 5 s for uploads with `Status == waiting` and processes the oldest such upload if at least one is present and satellite navigation data is available on the server. `Status` is first set to `processing` and finally to `complete` when all snapshots have been turned into location estimates and entered into the database. The script also handles user notifications via e-mail, push notifications, or Telegram messages.
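The processing loop can be sketched roughly as follows. The helper functions, the upload attributes, and the interpretation of the file-name layout (day as the day of the year) are assumptions for illustration, not the actual implementation in `process_queue.py`:

```python
import datetime
import os
import time

# Assumed location of the pre-processed navigation data.
NAV_DIR = "web_app/processing/navigation_data"


def nav_file_name(date, gnss):
    """Return the expected file name year_day_gnss.npy for one day and one
    satellite system ('G', 'E', or 'C'). Interpreting day as the day of the
    year (without zero-padding) is an assumption."""
    return f"{date.year}_{date.timetuple().tm_yday}_{gnss}.npy"


def nav_data_available(date):
    """Check whether navigation data for all three satellite systems of a
    given day is present on the server."""
    return all(
        os.path.isfile(os.path.join(NAV_DIR, nav_file_name(date, gnss)))
        for gnss in "GEC"
    )


def poll_queue(fetch_oldest_waiting_upload, process_upload):
    """Poll the database every 5 s and process the oldest waiting upload.
    Both helpers are hypothetical: the first is expected to return the
    oldest upload with Status == waiting (or None) and set its Status to
    processing; the second turns the upload's snapshots into location
    estimates and finally sets Status to complete."""
    while True:
        upload = fetch_oldest_waiting_upload()
        if upload is not None and nav_data_available(upload.date):
            process_upload(upload)
        time.sleep(5)
```

Running several such loops side by side is safe as long as claiming an upload (setting `Status` from `waiting` to `processing`) happens atomically in the database.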
For instructions on how to set up the PostgreSQL database, please see the snappergps-app repository. The most straightforward way to populate this database with raw data from your SnapperGPS receiver is to host your own version of the SnapperGPS app and point it to this database.
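A quick way to verify that this database is reachable before starting the back-end is a minimal connection test with psycopg2; the host, database name, user, and password below are placeholders, not the actual values from `config.py`:

```python
def build_dsn(host, port, dbname, user, password):
    """Assemble a libpq-style connection string from individual settings."""
    return (f"host={host} port={port} dbname={dbname} "
            f"user={user} password={password}")


if __name__ == "__main__":
    # psycopg2 is imported lazily so that build_dsn stays importable
    # even without the PostgreSQL client libraries installed.
    import psycopg2

    # Placeholder credentials; replace with the values from config.py.
    dsn = build_dsn("localhost", 5432, "snappergps", "snapper", "secret")
    with psycopg2.connect(dsn) as connection:
        with connection.cursor() as cursor:
            cursor.execute("SELECT version()")
            print(cursor.fetchone()[0])
```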
On a bare-bones server, it might be necessary to first install some or all of the following packages:

- the PostgreSQL library (`libpq-dev`) to communicate with the PostgreSQL database backend,
- the Geospatial Data Abstraction Library (`libgdal-dev`) to handle geospatial data formats,
- a terminal multiplexer (`tmux`) to access and control multiple terminals on the server,
- a command-line tool for transferring data with URL syntax (`curl`) to fetch navigation data from the internet,
- Python 3.7 (`python3.7`, `python3.7-dev`), including virtual environments (`python3.7-venv`), although any other Python 3.x might work, too, and
- a package installer for Python (`pip`).

On a Debian-based system, they can be installed with the following commands, although Python can also be installed via Anaconda or Miniconda:
sudo apt install libpq-dev
sudo apt install libgdal-dev
sudo apt install tmux
sudo apt install curl
sudo apt install python3.7
sudo apt install python3.7-dev
sudo apt install python3.7-venv
curl -o get-pip.py https://bootstrap.pypa.io/get-pip.py
python3.7 get-pip.py
Tested with Python 3.7 on Ubuntu 16.04.
- Clone the repository (`git clone [email protected]:SnapperGPS/snappergps-backend.git` or `git clone https://github.com/SnapperGPS/snappergps-backend.git`).
- Complete `config.py` with the information about your SQL database. If you also want e-mail notifications, push notifications, and/or Telegram notifications to work, then complete the respective sections, too.
- If you want to use NASA (the default) as the source for the satellite navigation data and not the BKG, then create an account on urs.earthdata.nasa.gov and enter the login details in `netrc.txt`. Alternatively, open `maintain_navigation_data.py` and change the line `bkg = False` into `bkg = True`.
- Create and activate the virtual environment snappergps_env (`python3.7 -m venv snappergps_env` and `source snappergps_env/bin/activate`).
- Install the requirements (`python3.7 -m pip install -r snappergps-backend/requirements.txt`).
- Optionally, install mkl-fft for faster acquisition. (This might not work depending on the server hardware.)
- Create a Tmux session nav (`tmux new -s nav` or `tmux -S /path/to/socket new -s nav`). Tmux is required to keep processes running after logging out of the server. If this is not required, then this step can be skipped.
- In session nav, activate the virtual environment snappergps_env (`source snappergps_env/bin/activate`).
- Run the navigation data script (`cd snappergps-backend/web_app/processing/` and `python3.7 maintain_navigation_data.py`). Note that this will only download navigation data from two days ago onward. To get historic data, run `python3.7 maintain_navigation_data.py --past-days 7` instead. This will download all data from seven days ago onward (or from whatever number of days you choose). You will not be able to process any snapshots from days for which you have not downloaded navigation data first.
- Leave the Tmux session (`Ctrl`+`b`, then `d`).
- Create a Tmux session proc0 (`tmux new -s proc0` or `tmux -S /path/to/socket new -s proc0`). Optionally, create proc1, proc2, ...
- In session proc0, activate the virtual environment snappergps_env (`source snappergps_env/bin/activate`).
- Run the processing script (`cd snappergps-backend/web_app/processing/` and `python3.7 process_queue.py` or `python3.7 process_queue.py --no-telegram-bot`). Only one instance shall run the Telegram bot at any time. All other instances shall be started with the `--no-telegram-bot` flag. Optionally, run the script in proc1, proc2, ..., too.
- Optionally, set the maximum number of snapshots that the acquisition processes in parallel with the `--max-batch-size` command line argument (e.g., `python3.7 process_queue.py --max-batch-size 12`). The default value is 10. For optimal execution speed, choose the value such that the RAM of the platform is reasonably filled, but not overfilled.
Useful Tmux commands:

- `tmux -S /path/to/socket list-sessions` to show all sessions.
- `tmux -S /path/to/socket attach -t proc42` to attach to the session named proc42.
- `tmux -S /path/to/socket new -s proc42` to start a new session named proc42.
- `Ctrl`+`b`, then `d` to detach from a session.
- `tmux -S /path/to/socket kill-session -t proc42` to kill the session named proc42.
Jonas Beuchert and Alex Rogers are based in the Department of Computer Science of the University of Oxford.
Jonas Beuchert is funded by the EPSRC Centre for Doctoral Training in Autonomous Intelligent Machines and Systems (University of Oxford Project Code: DFT00350-DF03.01, UKRI/EPSRC Grant Reference: EP/S024050/1) and works on SnapperGPS as part of his doctoral studies.
This documentation is licensed under a Creative Commons Attribution 4.0 International License.