This script automates backing up a GitHub repository to a local directory. It implements a rotational backup strategy, pushes the backups to Google Drive using rclone, and sends a curl notification when a backup completes successfully.
You can watch a video of the basic scenarios we face in real life! 👇 Just click on the thumbnail below.
- Automated backups of project files from a GitHub repository.
- Rotational backup strategy with customizable retention periods for daily, weekly, and monthly backups.
- Backup files stored on Google Drive using rclone.
- Older backups deleted automatically according to the retention policy.
- Notification of successful backups sent via a curl request.
Before using the script, make sure the following tools are installed.
Configuration Tool | Description |
---|---|
1. Git | To clone your GitHub project. |
2. rclone | To push backups to Google Drive. |
3. curl | To send notifications of successful backups. |
- Install rclone with the official install script:
- curl https://rclone.org/install.sh | sudo bash
- Next, configure rclone for Google Drive. Run rclone config and create a remote for Google Drive:
- Note: in the permission commands below, romil:romil sets the owner user and group, and mode 755 means the owner can read, write, and execute the file while group and others can read and execute it.
rclone config option | Description |
---|---|
1. New remote | Enter n to create a new remote connection, then give it a name. |
2. Storage | Enter the number for Google Drive (17 at the time of writing); rclone supports many different storage backends. |
3. Client id | Simply press Enter to accept the default. |
4. Client secret | Simply press Enter to accept the default. |
5. Scope | Enter 1 for full access to all files. |
6. Service account file | Simply press Enter to accept the default. |
7. Advanced config | Simply press Enter to accept the default; no advanced configuration is needed here. |
8. Use auto config | Simply press y. |
After that, open the link shown, sign in, and grant access. Now mount the remote on a local directory:
mkdir ~/mydirectory
sudo chmod 755 ~/mydirectory
rclone mount newgdrive_backup: ~/mydirectory
Create the local directory that will hold the repository:
mkdir /home/DevOps/Github_local_repo/
If permissions are needed, grant them as follows:
sudo chown romil:romil /home/DevOps/backupwala
sudo chmod 755 /home/DevOps/backupwala
Cloning the repository
First of all, clone the GitHub repository into the local directory where the backup script will create backups:
git clone https://github.com/RomilMovaliya/DemoPractical.git /home/DevOps/Github_local_repo/
Before running the script, make it executable:
sudo chmod +x ./backup.sh
Run the backup.sh script with the following parameters:
./backup.sh /home/DevOps/Github_local_repo/ /home/DevOps/backupwala
- /home/DevOps/Github_local_repo/ ----> /path/to/local/project
- /home/DevOps/backupwala ----> /path/to/backup/directory
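The script itself is not shown above, so the sketch below is only a minimal reconstruction of what a backup.sh of this shape might contain, assuming a project_YYYY-MM-DD.zip naming scheme, the newgdrive_backup remote configured earlier, and a placeholder webhook URL. It creates the archive with Python's zipfile CLI purely so the sketch stays portable; the real script would more likely use zip -r.

```shell
#!/bin/bash
# Sketch of backup.sh: archive the project, rotate old archives, upload, notify.
# This is a reconstruction, NOT the repository's actual script.
set -euo pipefail

SOURCE_DIR="${1:-$(mktemp -d)}"            # /path/to/local/project (demo default: empty temp dir)
DIRECTORY_OF_BACKUP="${2:-$(mktemp -d)}"   # /path/to/backup/directory
RETENTION_DAYS=7

mkdir -p "$DIRECTORY_OF_BACKUP"

# 1. Create a dated archive; the date in the filename is what the weekly and
#    monthly retention checks later parse back out.
STAMP=$(date +%Y-%m-%d)
ARCHIVE="$DIRECTORY_OF_BACKUP/project_${STAMP}.zip"
python3 -m zipfile -c "$ARCHIVE" "$SOURCE_DIR"

# 2. Rotate: delete daily backups older than the retention period.
find "$DIRECTORY_OF_BACKUP" -type f -name "*.zip" -mtime +"$RETENTION_DAYS" -exec rm {} \;

# 3. Push the new archive to Google Drive (skipped when rclone is absent or unconfigured).
if command -v rclone >/dev/null 2>&1; then
    rclone copy "$ARCHIVE" newgdrive_backup:backups/ || echo "upload failed; is the remote configured?"
fi

# 4. Notify a webhook of successful completion (placeholder URL).
curl -fsS -X POST "https://example.com/backup-webhook" \
     -d "message=Backup ${STAMP} completed" || true

echo "created $ARCHIVE"
```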
find "$DIRECTORY_OF_BACKUP"
: Searches within the directory named by the variable.
-type f
: Matches files instead of directories.
-name "*.zip"
: Filters for files with the .zip extension.
-mtime +$RETENTION_DAYS
: Matches files whose modification time is more than $RETENTION_DAYS days ago.
-exec rm {} \;
: Removes each matched file; {} is replaced with the name of the file found.
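Put together, these predicates form a single command. Here is a self-contained demo in a temporary directory (the file names and retention period are example values, and GNU touch -d is assumed):

```shell
# Demo of the daily-rotation rule in an isolated temp directory.
DIRECTORY_OF_BACKUP=$(mktemp -d)
RETENTION_DAYS=7

touch -d "10 days ago" "$DIRECTORY_OF_BACKUP/project_2024-01-01.zip"  # older than retention
touch "$DIRECTORY_OF_BACKUP/project_2024-01-10.zip"                   # fresh backup

# Delete .zip files modified more than RETENTION_DAYS days ago.
find "$DIRECTORY_OF_BACKUP" -type f -name "*.zip" -mtime +"$RETENTION_DAYS" -exec rm {} \;

ls "$DIRECTORY_OF_BACKUP"   # only the fresh backup should remain
```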
find "$DIRECTORY_OF_BACKUP"
: Starts the search within the specified directory.
-type f -name "*.zip"
: Restricts matches to files (not directories) whose names end with the .zip extension.
-exec bash -c '...' \;
: Executes a bash command for each file found, allowing a custom check per file.
$(basename {} .zip | cut -d"_" -f2)
: basename {} .zip removes the .zip extension, and cut -d"_" -f2 extracts the date from the filename.
$(date -d ... +%u)
: Converts the extracted date into the numeric day of the week (e.g. 1 = Monday, 7 = Sunday).
(( $(date -d ... +%u) == 7 ))
: True when that day of the week is Sunday, i.e. the file was backed up on a Sunday.
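Assembled, the weekly check might look like the self-contained sketch below. The project_YYYY-MM-DD.zip filename pattern is an assumption, and matched files are only printed here rather than acted on; passing the filename as $1 via bash -c '...' _ {} is a safer variant of splicing {} into the quoted script:

```shell
# Demo: identify backups whose filename date falls on a Sunday.
DIRECTORY_OF_BACKUP=$(mktemp -d)
touch "$DIRECTORY_OF_BACKUP/project_2024-01-07.zip"   # 2024-01-07 was a Sunday
touch "$DIRECTORY_OF_BACKUP/project_2024-01-08.zip"   # 2024-01-08 was a Monday

find "$DIRECTORY_OF_BACKUP" -type f -name "*.zip" -exec bash -c '
  d=$(basename "$1" .zip | cut -d"_" -f2)    # extract the date part of the name
  if (( $(date -d "$d" +%u) == 7 )); then    # %u: 1 = Monday ... 7 = Sunday
    echo "weekly backup: $1"
  fi
' _ {} \;
```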
find "$DIRECTORY_OF_BACKUP"
: Same as before.
-type f -name "*.zip"
: Same filtering.
-exec bash -c '...' \;
: Executes a bash command for each file found.
$(basename {} .zip | cut -d"_" -f2)
: Same extraction of the date part from the filename.
$(date -d ... +%d)
: Converts the extracted date into the day of the month.
(( $(date -d ... +%d) == 1 ))
: Checks whether the day of the month is the 1st; true if the file is a backup from the 1st of the month.
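The monthly check assembles the same way; this self-contained sketch (again assuming the project_YYYY-MM-DD.zip naming) only prints the matches. Note the 10# prefix, which keeps zero-padded days like 08 and 09 from being read as invalid octal in bash arithmetic:

```shell
# Demo: identify backups whose filename date is the 1st of the month.
DIRECTORY_OF_BACKUP=$(mktemp -d)
touch "$DIRECTORY_OF_BACKUP/project_2024-02-01.zip"   # 1st of the month
touch "$DIRECTORY_OF_BACKUP/project_2024-02-15.zip"   # mid-month backup

find "$DIRECTORY_OF_BACKUP" -type f -name "*.zip" -exec bash -c '
  d=$(basename "$1" .zip | cut -d"_" -f2)
  if (( 10#$(date -d "$d" +%d) == 1 )); then   # %d is zero-padded; 10# forces base 10
    echo "monthly backup: $1"
  fi
' _ {} \;
```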
- The first line deletes all .zip files older than a certain number of days specified by $RETENTION_DAYS.
- The second line identifies files where the date in the filename corresponds to a Sunday, but it doesn’t perform any deletion or retention action by itself.
- The third line identifies files where the date in the filename is the 1st of the month, but again, it doesn’t perform any retention or deletion action directly.
Here we can see the backups stored in Google Drive.
Here we receive the success notification, sent via curl, on the webhook's dashboard.
⏰ We can also schedule the job as a cron job.
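For example, a crontab entry like the one below would run the backup every night at 02:00. This is only an illustrative fragment; the script path, arguments, and log file are placeholders, not taken from the repository. Install it with crontab -e.

```
# m h dom mon dow  command
0 2 * * * /home/DevOps/backup.sh /home/DevOps/Github_local_repo/ /home/DevOps/backupwala >> /home/DevOps/backup.log 2>&1
```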