A common backup task is to rotate your backups, so that you have multiple copies of critical files going back a set number of days. Below is a bash script (for the Linux command line) that does this.
Why Rotate Backups? Why not just one current backup file…
The primary reason to have a rotating backup system is to create a simple form of versioning. By keeping multiple backup copies taken at various intervals, you can effectively go back in time and recover files that were overwritten or maliciously corrupted. With only one recent backup, you run the risk that an important file was accidentally overwritten or deleted and the backup now contains only the overwritten version, not the original. Keeping multiple copies taken at different points in time mitigates this.
Another approach is to combine directory archiving (zipping) with committing the archive into a version control system like Subversion or Git. That way the version control system keeps a versioned copy of every backup. Beware, though: versioning binaries can consume space very quickly.
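A rough sketch of that archive-and-commit approach is below. It uses temporary directories so the example is self-contained and runnable; in practice the source and repository paths would be your real ones, and the repo would be initialized once up front.

```shell
#!/bin/sh
# Sketch: archive a folder and commit the archive into a local Git repo,
# so Git keeps a versioned history of backups. Paths here are temporary
# just to make the example self-contained; substitute your own.
set -e
SRC=$(mktemp -d); echo "hello" > "$SRC/file.txt"   # stand-in source folder
REPO=$(mktemp -d)                                  # stand-in backup repo
git -C "$REPO" init -q
git -C "$REPO" config user.email "backup@example.com"
git -C "$REPO" config user.name "Backup Bot"

ARCHIVE="$REPO/$(basename "$SRC").tgz"
tar -czf "$ARCHIVE" -C "$(dirname "$SRC")" "$(basename "$SRC")"
git -C "$REPO" add "$(basename "$ARCHIVE")"
git -C "$REPO" commit -q -m "Backup of $SRC on $(date +%F)"
git -C "$REPO" log --oneline    # each run adds one versioned snapshot
```

Each run overwrites the archive file and commits it again, so `git log` becomes your backup timeline; note that Git stores binary deltas poorly, which is the space caveat mentioned above.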
There are also systems like macOS Time Machine which, when turned on, effectively version all file changes at the OS level; I'm not sure if there's an equivalent of Time Machine for Windows or Linux.
Read here about a related script that monitors and alerts about backups
Simple backup and rotate script
You can find the latest code here: https://github.com/acbrandao/scripts/tree/master/shell_backup_rotate
The script below performs a simple tar (gzip) of a particular directory, tags the tar file with a date, and moves it to a destination folder. At the destination it checks how many previous copies exist; once the ROTATE_PERIOD count is reached (7, 10, 30, etc. copies) it removes the oldest, and so on. This way you always have x copies going back in time sequentially from the latest.
Usage is pretty straightforward: use the -p argument to specify the rotation period; the default is 10 backups.
$ backuprot -p 5 /source_folder/ /destination_folder/
There's an optional pair of lines that copies the latest backup to a separate "Latest" file. I did this to easily identify the most recent backup, but it's not ideal, especially for large tar files; you can comment those lines out if space is a concern.
#!/bin/bash
#########################
# Backs up SOURCE_FOLDER to DESTINATION_FOLDER: creates a tgz and performs a basic rotation
#########################
SOURCE_FOLDER="/source/"        # source folder
DESTINATION_FOLDER="/backup/"   # mounted destination folder
BASENAME=$(basename "$SOURCE_FOLDER")
ROTATE_PERIOD=10

# datestamp holds a formatted date
datestamp=$(date +"%d-%m-%Y")

#### Display command usage ########
usage()
{
cat << EOF
USAGE: backuprot [OPTIONS] /source_folder/ /destination_folder/

Backs up an entire folder: creates a tgz and performs an x-day rotation of backups.
You must provide source and destination folders.

OPTIONS:
   -p      Specify rotation period in days - default is $ROTATE_PERIOD

EXAMPLES:
   backuprot -p 5 [/source_folder/] [/destination_folder/]
EOF
}

#### Getopts #####
while getopts ":p:" opt; do
  case "$opt" in
    p)  ROTATE_PERIOD=${OPTARG} ;;
    \?) echo "$OPTARG is an unknown option"
        usage
        exit 1 ;;
  esac
done
shift $((OPTIND-1))

if [ -z "$1" ] || [ -z "$2" ]; then
  usage
else
  # Back up and gzip the directory
  SOURCE_FOLDER=$1
  BASENAME=$(basename "$SOURCE_FOLDER")
  TGZFILE="$BASENAME-$datestamp.tgz"
  LATEST_FILE="$BASENAME-Latest.tgz"
  DESTINATION_FOLDER=$2

  echo "Starting Backup and Rotate"
  echo "-----------------------------"
  echo "Source Folder : $SOURCE_FOLDER"
  echo "Target Folder : $DESTINATION_FOLDER"
  echo "Backup file   : $TGZFILE"
  echo "-----------------------------"

  if [ ! -d "$SOURCE_FOLDER" ] || [ ! -d "$DESTINATION_FOLDER" ]; then
    echo "SOURCE ($SOURCE_FOLDER) or DESTINATION ($DESTINATION_FOLDER) folder doesn't exist or is misspelled; check and re-try."
    exit 1
  fi

  echo "Creating $SOURCE_FOLDER/$TGZFILE ..."
  # --exclude keeps any previous .tgz files out of the new archive
  tar --exclude='*.tgz' -zcvf "$SOURCE_FOLDER/$TGZFILE" "$SOURCE_FOLDER"

  echo "Copying $SOURCE_FOLDER/$TGZFILE to $LATEST_FILE ..."
  cp "$SOURCE_FOLDER/$TGZFILE" "$SOURCE_FOLDER/$LATEST_FILE"

  echo "Moving $TGZFILE -- to --> $DESTINATION_FOLDER"
  mv "$SOURCE_FOLDER/$TGZFILE" "$DESTINATION_FOLDER"

  echo "Moving $LATEST_FILE -- to --> $DESTINATION_FOLDER"
  mv "$SOURCE_FOLDER/$LATEST_FILE" "$DESTINATION_FOLDER"

  # Count the files in the destination folder, then rotate: delete backups
  # older than ROTATE_PERIOD days, but only if there are more than
  # ROTATE_PERIOD backups present.
  FILE_COUNT=$(find "$DESTINATION_FOLDER" -maxdepth 1 -type f | wc -l)
  echo "Rotation period $ROTATE_PERIOD for $DESTINATION_FOLDER"
  echo "$FILE_COUNT files found in $DESTINATION_FOLDER folder"
  echo "find $DESTINATION_FOLDER -mtime +$ROTATE_PERIOD"
  echo "-----------------------------------"

  if [ "$FILE_COUNT" -gt "$ROTATE_PERIOD" ]; then
    echo "Removing backups older than $ROTATE_PERIOD days in $DESTINATION_FOLDER"
    echo "Removing these old backup files..."
    find "$DESTINATION_FOLDER" -maxdepth 1 -type f -mtime +$ROTATE_PERIOD -exec rm {} \;
  else
    echo "Only $FILE_COUNT files; NOT removing older backups in $DESTINATION_FOLDER"
  fi
fi

echo "----------------"
echo "Backup_rot complete."
echo "To extract the file: tar -xzvf $TGZFILE"
Thanks for this. Got it running right now, and it all appears to be working just like it should. Of course it’s got to chew through about 20 gigs of files before it’s done.
Yeah, that's a pretty big set of files to go through; you might consider a version control system like Git or Subversion to handle incremental backups of that size. If you're interested, I may do a post soon on how to tie Git into an incremental backup system.
Thanks for the script, works great.
How do I run it as a cron job?
First familiarize yourself with editing a cron file: http://www.cyberciti.biz/faq/how-do-i-add-jobs-to-cron-under-linux-or-unix-oses/
then from the console:
> crontab -e
then add a command like this to the end of the crontab; adjust it to meet your needs. The example below runs the script every hour, on the hour:
0 * * * * backuprot -p 5 /source_folder/ /destination_folder/
| | | | |
| | | | +----- Day of week (0 - 7) (Sunday = 0 or 7)
| | | +------- Month (1 - 12)
| | +--------- Day of month (1 - 31)
| +----------- Hour (0 - 23)
+------------- Minute (0 - 59)
Save the edited cron entries.
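One caveat worth noting: cron runs jobs with a minimal environment (a very short PATH), so it's safer to use the script's absolute path and capture output to a log file. A hedged example crontab entry, assuming the script lives at /usr/local/bin/backuprot (a hypothetical path):

```
# Run nightly at 2:00 AM; absolute paths because cron's PATH is minimal.
# /usr/local/bin/backuprot and the log path are illustrative; adjust to your setup.
0 2 * * * /usr/local/bin/backuprot -p 5 /source_folder/ /destination_folder/ >> /var/log/backuprot.log 2>&1
```

The `>> ... 2>&1` part appends both stdout and stderr to the log, which makes it much easier to diagnose a backup that silently failed overnight.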
This part didn’t work on my Ubuntu18.04 system:
while getopts ":p " opt; do
I changed it to
while getopts ":p:" opt; do
Thanks, I'll be putting this code up on GitHub shortly; check https://github.com/acbrandao/scripts for updates.
This is also a very good backup script: https://github.com/ccztux/glsysbackup
You make the tgz inside the source folder; this makes the tgz part of the folder, so it gets included in the next backup. Needs to be fixed.
Dino, thanks. Can you suggest an alternate approach and I'll fix it?
Thanks, you can use a file exclusion list to omit .tgz or other extensions from being re-packaged.
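As a sketch of that exclusion approach, GNU tar's `--exclude` flag keeps existing archives out of the new one. The directories below are created on the fly just to make the example self-contained:

```shell
#!/bin/sh
# Sketch: exclude existing .tgz files when re-archiving a folder, so that
# previous backups are not packed into the new backup. Paths are temporary
# stand-ins for the real source folder.
set -e
SRC=$(mktemp -d)
echo "data" > "$SRC/file.txt"
touch "$SRC/old-backup.tgz"    # simulate a leftover backup inside the source

OUT=/tmp/new-backup.tgz
tar --exclude='*.tgz' -czf "$OUT" -C "$(dirname "$SRC")" "$(basename "$SRC")"
tar -tzf "$OUT"                # lists file.txt but not old-backup.tgz
```

Applied to the script above, that means adding `--exclude='*.tgz'` to its tar command so the `Latest` copy and any not-yet-moved archives are skipped.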
/home/pi/backuprot.sh: line 35: syntax error near unexpected token `&'
/home/pi/backuprot.sh: line 35: ` '