A simple backup tool that uploads encrypted files to S3 in batches. Ideally, it should be set up to run from a crontab entry.
- Encrypts files locally with `gpg` (the encryption step is sketched below)
- Uploads files to S3 in batches of customizable size
- Supports uploading a `tar` archive of the files in certain folders, which is useful for sources with thousands of files (e.g. a photo library)
- Rescans sources at specific intervals to find new or updated files
- Removes files from S3 if they are removed locally
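As a rough illustration of the local encryption step, here is what encrypting a single file with `gpg` can look like. This is a minimal sketch assuming symmetric (passphrase-based) encryption; the options the tool actually passes to `gpg` may differ:

```shell
# Symmetric encryption: gpg prompts for a passphrase and writes an
# encrypted copy next to the original. The plaintext never leaves the
# machine; only the .gpg file would be uploaded to S3.
gpg --symmetric --cipher-algo AES256 --output document.pdf.gpg document.pdf
```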
Requirements:

- OS: Linux, macOS (untested)
- Node.js 10+
- `awscli` 1.8.6+ (for support of the `STANDARD_IA` storage class)
- `find`
- `gpg`
- `tar`
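To quickly confirm that the prerequisites are in place, you can check them from a shell (these are standard commands, nothing specific to this tool):

```shell
node --version           # expect v10 or newer
aws --version            # expect aws-cli/1.8.6 or newer
command -v gpg find tar  # should print one path per tool
```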
Setup:

- `aws configure`
- `yarn install --production` OR `npm install --production`
- `cp config.sample.js config.default.js`
- Modify your new config file
- Check your config file: `bin/backup-to-cloud --check-config`
- Try it out first with: `bin/backup-to-cloud --dry`
- Set up a crontab entry for it (one way to install the entry non-interactively is sketched after this list), for example:
  - run every hour with verbose logging:
    `0 * * * * cd /path/to/this && ./bin/backup-to-cloud --verbose >> cron.log 2>&1`
  - run every 12 hours:
    `0 */12 * * * cd /path/to/this && ./bin/backup-to-cloud >> cron.log 2>&1`
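If you prefer not to edit the crontab interactively, an entry such as the hourly example above can also be appended non-interactively. This is plain `crontab` usage, not a feature of the tool:

```shell
# Append the hourly entry to the current user's crontab; the stderr
# redirect hides the warning `crontab -l` prints when no crontab exists yet.
( crontab -l 2>/dev/null; \
  echo '0 * * * * cd /path/to/this && ./bin/backup-to-cloud --verbose >> cron.log 2>&1' \
) | crontab -
```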
Run a backup:

```shell
./bin/backup-to-cloud --help
./bin/backup-to-cloud --check-config
./bin/backup-to-cloud --dry
./bin/backup-to-cloud
```
Restore a file or folder and decrypt it:

```shell
./bin/backup-restore --help
./bin/backup-restore --output OUTPUT_DIR_OR_FILE REMOTE_DIR_OR_FILE
```
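For example, to restore a remote folder into a local directory (both paths here are purely illustrative):

```shell
# Download everything stored under /home/me/photos and decrypt it
# into /tmp/restored.
./bin/backup-restore --output /tmp/restored /home/me/photos
```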
Schedule a restore test:

```
0 1 * * * cd /path/to/this && ./bin/backup-restore --output TEMPORARY_DIR --test / >> restore-test.log 2>&1
```
Decrypt a downloaded encrypted file:

```shell
./bin/backup-decrypt --help
./bin/backup-decrypt --output OUTPUT_FILE INPUT_FILE
```
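Since files are encrypted with `gpg`, a downloaded file can also be decrypted with `gpg` directly. A minimal sketch, assuming symmetric (passphrase-based) encryption; the tool's actual cipher settings may differ:

```shell
# gpg prompts for the passphrase and writes the decrypted copy.
gpg --decrypt --output OUTPUT_FILE INPUT_FILE
```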
Verify that the DB and remote files are in sync:

```shell
./bin/backup-verify --help
./bin/backup-verify --dry
./bin/backup-verify
```
The DB format has switched from JSON to SQLite. To upgrade an existing DB, run:

```shell
./bin/backup-upgrade-db
```
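Because the DB is now a regular SQLite file, it can be inspected with the standard `sqlite3` CLI. The file path below is a placeholder; the actual location depends on your configuration:

```shell
sqlite3 /path/to/backup-db.sqlite3 '.tables'  # list the tables
sqlite3 /path/to/backup-db.sqlite3 '.schema'  # show the full schema
```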