This repository has been archived by the owner on May 4, 2021. It is now read-only.

Backup script does not have the right environment when running with CRON_TIME #23

Open: wants to merge 2 commits into master

README.md (4 changes: 2 additions & 2 deletions)

@@ -48,10 +48,10 @@ RESTORE=false

`dockup` will use your AWS credentials to create a new bucket with name as per the environment variable `S3_BUCKET_NAME`, or if not defined, using the default name `docker-backups.example.com`. The paths in `PATHS_TO_BACKUP` will be tarballed, gzipped, time-stamped and uploaded to the S3 bucket.

If you want `dockup` to run as a cron task, you can set the environment variable `CRON_TIME` to the desired frequency, for example `CRON_TIME=0 0 * * *` to backup every day at midnight.

## Restore
-To restore your data simply set the `RESTORE` environment variable to `true` - this will restore the latest backup from S3 to your volume.
+To restore your data simply set the `RESTORE` environment variable to `true` - this will restore the latest backup from S3 to your volume. If you want to restore a specific backup instead of the last one, you can also set the environment variable `LAST_BACKUP` to the desired tarball name.

## A note on Buckets
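
The settings shown in this hunk are all plain environment variables, so they are normally passed to the container at run time. The invocation below is only a hedged sketch: the image name `tutum/dockup`, the volume path, and the AWS credential variable names (the standard AWS CLI ones) are assumptions, not something this diff specifies.

```bash
# Hypothetical daily backup at midnight; every value here is a placeholder.
docker run -d \
  -e AWS_ACCESS_KEY_ID=AKIAEXAMPLE \
  -e AWS_SECRET_ACCESS_KEY=secretexample \
  -e S3_BUCKET_NAME=docker-backups.example.com \
  -e PATHS_TO_BACKUP=/data \
  -e CRON_TIME="0 0 * * *" \
  -v my_data:/data \
  tutum/dockup

# Hypothetical restore of one specific tarball instead of the latest backup.
docker run --rm \
  -e AWS_ACCESS_KEY_ID=AKIAEXAMPLE \
  -e AWS_SECRET_ACCESS_KEY=secretexample \
  -e S3_BUCKET_NAME=docker-backups.example.com \
  -e PATHS_TO_BACKUP=/data \
  -e RESTORE=true \
  -e LAST_BACKUP=backup-20210101000000.tar.gz \
  -v my_data:/data \
  tutum/dockup
```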

restore.sh (6 changes: 4 additions & 2 deletions)

@@ -1,7 +1,9 @@
#!/bin/bash

-# Find last backup file
-: ${LAST_BACKUP:=$(aws s3 ls s3://$S3_BUCKET_NAME | awk -F " " '{print $4}' | grep ^$BACKUP_NAME | sort -r | head -n1)}
+if [ -n "${LAST_BACKUP}" ]; then
+  # Find last backup file
+  : ${LAST_BACKUP:=$(aws s3 ls s3://$S3_BUCKET_NAME | awk -F " " '{print $4}' | grep ^$BACKUP_NAME | sort -r | head -n1)}
+fi

# Download backup from S3
aws s3 cp s3://$S3_BUCKET_NAME/$LAST_BACKUP $LAST_BACKUP
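
The restore script hinges on bash's default-assignment expansion: `: ${VAR:=value}` assigns `value` only when `VAR` is unset or empty, so a `LAST_BACKUP` supplied by the user is never overwritten by the S3 listing. A minimal, self-contained sketch of that idiom follows; the helper function and file names are made up, the real script queries S3 instead.

```bash
#!/bin/bash
# Stand-in for `aws s3 ls ... | sort -r | head -n1`; returns a fake "latest" name.
latest_backup_name() {
  echo "backup-20210101000000.tar.gz"
}

# Assigns only when LAST_BACKUP is unset or empty; a value set by the caller wins.
: ${LAST_BACKUP:=$(latest_backup_name)}

echo "Would restore: $LAST_BACKUP"
```

Run as-is, the sketch prints the stand-in "latest" name; run as `LAST_BACKUP=backup-custom.tar.gz ./sketch.sh`, it prints the supplied value, which mirrors the `LAST_BACKUP` override documented in the README change above.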

run.sh (19 changes: 10 additions & 9 deletions)

@@ -3,12 +3,13 @@
if [[ "$RESTORE" == "true" ]]; then
  ./restore.sh
else
-  ./backup.sh
-fi
-
-if [ -n "$CRON_TIME" ]; then
-  echo "${CRON_TIME} /backup.sh >> /dockup.log 2>&1" > /crontab.conf
-  crontab /crontab.conf
-  echo "=> Running dockup backups as a cronjob for ${CRON_TIME}"
-  exec cron -f
-fi
+  if [ -n "$CRON_TIME" ]; then
+    env | grep -v 'affinity:container' | sed -e 's/^\([^=]*\)=\(.*\)/export \1="\2"/' > /env.conf # Save current environment
+    echo "${CRON_TIME} . /env.conf && /backup.sh >> /dockup.log 2>&1" > /crontab.conf
+    crontab /crontab.conf
+    echo "=> Running dockup backups as a cronjob for ${CRON_TIME}"
+    exec cron -f
+  else
+    ./backup.sh
+  fi
+fi
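
The run.sh change addresses the fact that cron starts its jobs with a nearly empty environment, so variables such as `S3_BUCKET_NAME` or the AWS credentials passed to the container never reach the scheduled backup. The patch therefore dumps the container's environment to a file as `export` statements and sources that file at the start of the crontab entry. The standalone sketch below shows the same transformation; the file paths and schedule are placeholders rather than the ones the image uses.

```bash
#!/bin/bash
# Rewrite every NAME=value pair printed by `env` as `export NAME="value"`.
# The grep drops Docker's affinity:container variable, as the patch does.
env | grep -v 'affinity:container' \
    | sed -e 's/^\([^=]*\)=\(.*\)/export \1="\2"/' > /tmp/env.conf

# Install a crontab entry that sources the saved environment before the job runs
# (every five minutes here, purely as an example schedule).
echo '*/5 * * * * . /tmp/env.conf && /usr/local/bin/backup.sh >> /tmp/backup.log 2>&1' | crontab -
```

One limitation worth noting: values that themselves contain double quotes or newlines are not escaped by this `sed` expression, which is the same trade-off the patched run.sh makes.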