feat: use azcopy for databases backups (eduNEXT#51)
* feat: install azcopy

* feat: add new variables and conditionals for storage services

* fix: include custom storage endpoint inside s3 conditional

* feat: add variables and azcopy command

* fix: update variable names

* fix: storage system names

* fix: default s3 value

* fix: default s3 value

* fix: error in readme

* update backup system variable name

* fix: azure-blob conditional
marbonilla authored Nov 16, 2023
1 parent 816cfb0 commit 03881f2
Showing 5 changed files with 42 additions and 3 deletions.
6 changes: 6 additions & 0 deletions drydock_backups/Dockerfile
@@ -11,6 +11,12 @@
RUN apt-get update && apt-get install -y \
awscli \
&& rm -rf /var/lib/apt/lists/*

RUN wget -O /tmp/azcopy.tar.gz https://aka.ms/downloadazcopy-v10-linux \
&& tar -xvf /tmp/azcopy.tar.gz -C /tmp \
&& cp /tmp/azcopy_linux_amd64*/azcopy /usr/bin/ \
&& chmod +x /usr/bin/azcopy \
&& rm -rf /tmp/azcopy*

RUN useradd -m backupuser
RUN echo "backupuser ALL=(ALL) NOPASSWD: /usr/bin/mysql, /usr/bin/mongodump" >> /etc/sudoers

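The extract-and-install pattern in the Dockerfile above can be exercised offline. This is a sketch against a locally built dummy tarball standing in for the real https://aka.ms/downloadazcopy-v10-linux download; the versioned directory name (`10.21.2`) and the `azcopy-ok` output are invented for illustration.

```shell
#!/bin/sh
# Offline dry run of the Dockerfile's azcopy install steps, using a
# locally built stand-in tarball instead of the real download.
set -eu
work=$(mktemp -d)
mkdir -p "$work/src/azcopy_linux_amd64_10.21.2" "$work/extract" "$work/bin"

# Stand-in "azcopy" binary inside the versioned directory the real tarball uses.
printf '#!/bin/sh\necho azcopy-ok\n' > "$work/src/azcopy_linux_amd64_10.21.2/azcopy"
tar -czf "$work/azcopy.tar.gz" -C "$work/src" azcopy_linux_amd64_10.21.2

# Same sequence as the Dockerfile: extract, glob the versioned dir, copy, chmod.
tar -xzf "$work/azcopy.tar.gz" -C "$work/extract"
cp "$work"/extract/azcopy_linux_amd64*/azcopy "$work/bin/"
chmod +x "$work/bin/azcopy"
out=$("$work/bin/azcopy")
echo "$out"
```

The glob (`azcopy_linux_amd64*`) is what lets the Dockerfile survive upstream version bumps without pinning a directory name.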
4 changes: 4 additions & 0 deletions drydock_backups/README.rst
@@ -17,11 +17,15 @@ Configuration variables

- **BACKUP_IMAGE**: The image used to run the cronjob. (default: `ednxops/shipyard-utils:v1.0.0`)
- **BACKUP_CRON_SCHEDULE**: Cron schedule to run the backup. (default: `0 2 * * *`)
- **BACKUP_STORAGE_SERVICE**: Storage service to use. (default: `aws-s3`) (options: `aws-s3`, `azure-blob`)
- **BACKUP_AWS_ACCESS_KEY**: AWS access key (or MinIO user) used to access the bucket.
- **BACKUP_AWS_SECRET_KEY**: AWS secret key (or MinIO password) used to access the bucket.
- **BACKUP_BUCKET_NAME**: Name of the bucket where the backups will be stored.
- **BACKUP_BUCKET_PATH**: Path inside the bucket where the backups will be stored.
- **BACKUP_CUSTOM_STORAGE_ENDPOINT**: Custom endpoint to access the bucket. (default: `None`)
- **BACKUP_AZURE_CONTAINER_NAME**: Name of the container where the backups will be stored.
- **BACKUP_AZURE_ACCOUNT_NAME**: Name of the account to access the container.
- **BACKUP_AZURE_CONTAINER_SAS_TOKEN**: SAS token to access the container.
- **BACKUP_K8S_USE_EPHEMERAL_VOLUMES**: Use ephemeral volumes to set up the cronjob. (default: `False`)
- **BACKUP_K8S_EPHEMERAL_VOLUME_SIZE**: Size of the ephemeral volume. (default: `8Gi`)
- **BACKUP_MYSQL_USERNAME**: Username to access the mysql database. (default: `{{ MYSQL_ROOT_USERNAME }}`)
13 changes: 10 additions & 3 deletions drydock_backups/docker-entrypoint.sh
@@ -25,8 +25,15 @@
else
exit 1
fi

-if [[ -z "${BACKUP_CUSTOM_STORAGE_ENDPOINT}" ]]; then
-    aws s3 mv $FILENAME s3://$S3_BUCKET_NAME/$BUCKET_PATH/$1/
-else
-    aws --endpoint-url $BACKUP_CUSTOM_STORAGE_ENDPOINT s3 mv $FILENAME s3://$S3_BUCKET_NAME/$BUCKET_PATH/$1/
-fi
+if [ "$BACKUP_STORAGE_SERVICE" = 'aws-s3' ]; then
+    if [[ -z "${BACKUP_CUSTOM_STORAGE_ENDPOINT}" ]]; then
+        aws s3 mv $FILENAME s3://$S3_BUCKET_NAME/$BUCKET_PATH/$1/
+    else
+        aws --endpoint-url $BACKUP_CUSTOM_STORAGE_ENDPOINT s3 mv $FILENAME s3://$S3_BUCKET_NAME/$BUCKET_PATH/$1/
+    fi
+elif [ "$BACKUP_STORAGE_SERVICE" = 'azure-blob' ]; then
+    azcopy cp $FILENAME "https://${AZURE_ACCOUNT_NAME}.blob.core.windows.net/${AZURE_CONTAINER_NAME}/$BUCKET_PATH/${1}/$FILENAME?${AZURE_CONTAINER_SAS_TOKEN}"
+else
+    echo "Unknown storage service"
+    exit 1
+fi
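The dispatch logic added to the entrypoint can be sketched as a dry run that only prints the command it would execute. `build_upload_cmd`, the sample values, and the `<sas>` placeholder are illustrative, not part of the plugin.

```shell
#!/bin/sh
# Dry-run sketch of the entrypoint's storage dispatch: print, don't execute.
# The function name and all sample values below are hypothetical.
build_upload_cmd() {
    case "$BACKUP_STORAGE_SERVICE" in
        aws-s3)
            if [ -z "${BACKUP_CUSTOM_STORAGE_ENDPOINT:-}" ]; then
                echo "aws s3 mv $FILENAME s3://$S3_BUCKET_NAME/$BUCKET_PATH/$1/"
            else
                echo "aws --endpoint-url $BACKUP_CUSTOM_STORAGE_ENDPOINT s3 mv $FILENAME s3://$S3_BUCKET_NAME/$BUCKET_PATH/$1/"
            fi ;;
        azure-blob)
            echo "azcopy cp $FILENAME https://$AZURE_ACCOUNT_NAME.blob.core.windows.net/$AZURE_CONTAINER_NAME/$BUCKET_PATH/$1/$FILENAME?<sas>" ;;
        *)
            echo "Unknown storage service" >&2; return 1 ;;
    esac
}

FILENAME=dump.sql.gz
S3_BUCKET_NAME=my-bucket
BUCKET_PATH=backups
AZURE_ACCOUNT_NAME=myaccount
AZURE_CONTAINER_NAME=db-backups
BACKUP_CUSTOM_STORAGE_ENDPOINT=

BACKUP_STORAGE_SERVICE=aws-s3
cmd_s3=$(build_upload_cmd mysql)
BACKUP_STORAGE_SERVICE=azure-blob
cmd_az=$(build_upload_cmd mysql)
echo "$cmd_s3"
echo "$cmd_az"
```

Note the asymmetry the branch preserves: `aws s3 mv` deletes the local dump after upload, while `azcopy cp` copies and leaves the local file in place.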
18 changes: 18 additions & 0 deletions drydock_backups/patches/drydock-multipurpose-jobs
@@ -33,6 +33,7 @@ spec:
value: '{{ BACKUP_MYSQL_USERNAME }}'
- name: MYSQL_ROOT_PASSWORD
value: '{{ BACKUP_MYSQL_PASSWORD }}'
{% if BACKUP_STORAGE_SERVICE == "aws-s3" %}
- name: AWS_ACCESS_KEY_ID
value: '{{ BACKUP_AWS_ACCESS_KEY }}'
- name: AWS_SECRET_ACCESS_KEY
@@ -41,6 +42,14 @@
value: '{{ BACKUP_BUCKET_NAME }}'
- name: BUCKET_PATH
value: '{{ BACKUP_BUCKET_PATH }}'
{% elif BACKUP_STORAGE_SERVICE == "azure-blob" %}
- name: AZURE_CONTAINER_NAME
value: '{{ BACKUP_AZURE_CONTAINER_NAME }}'
- name: AZURE_CONTAINER_SAS_TOKEN
value: '{{ BACKUP_AZURE_CONTAINER_SAS_TOKEN }}'
- name: AZURE_ACCOUNT_NAME
value: '{{ BACKUP_AZURE_ACCOUNT_NAME }}'
{% endif %}
{% if BACKUP_CUSTOM_STORAGE_ENDPOINT %}
- name: BACKUP_CUSTOM_STORAGE_ENDPOINT
value: '{{ BACKUP_CUSTOM_STORAGE_ENDPOINT }}'
@@ -101,6 +110,7 @@ spec:
value: '{{ BACKUP_MONGO_PASSWORD }}'
- name: MONGODB_DATABASES
value: '{{ BACKUP_MONGODB_DATABASE }}'
{% if BACKUP_STORAGE_SERVICE == "aws-s3" %}
- name: AWS_ACCESS_KEY_ID
value: '{{ BACKUP_AWS_ACCESS_KEY }}'
- name: AWS_SECRET_ACCESS_KEY
@@ -113,6 +123,14 @@
- name: BACKUP_CUSTOM_STORAGE_ENDPOINT
value: '{{ BACKUP_CUSTOM_STORAGE_ENDPOINT }}'
{% endif %}
{% elif BACKUP_STORAGE_SERVICE == "azure-blob" %}
- name: AZURE_CONTAINER_NAME
value: '{{ BACKUP_AZURE_CONTAINER_NAME }}'
- name: AZURE_CONTAINER_SAS_TOKEN
value: '{{ BACKUP_AZURE_CONTAINER_SAS_TOKEN }}'
- name: AZURE_ACCOUNT_NAME
value: '{{ BACKUP_AZURE_ACCOUNT_NAME }}'
{% endif %}
{% if BACKUP_K8S_USE_EPHEMERAL_VOLUMES %}
volumeMounts:
- mountPath: /data/
4 changes: 4 additions & 0 deletions drydock_backups/plugin.py
@@ -15,10 +15,14 @@
"VERSION": __version__,
"IMAGE": "ednxops/shipyard-utils:v1.1.0",
"CRON_SCHEDULE": '0 2 * * *',
"BACKUP_STORAGE_SERVICE": "aws-s3",
"AWS_ACCESS_KEY": "",
"AWS_SECRET_KEY": "",
"BUCKET_NAME": "",
"BUCKET_PATH": "backups",
"AZURE_CONTAINER_NAME": "",
"AZURE_CONTAINER_SAS_TOKEN": "",
"AZURE_ACCOUNT_NAME": "",
"CUSTOM_STORAGE_ENDPOINT": None,
"K8S_USE_EPHEMERAL_VOLUMES": False,
"K8S_EPHEMERAL_VOLUME_SIZE": "8Gi",
