In one of my previous posts, MySQL backups, I talked about using a script to automate backups and showed that we can gzip the backup file to save space. Since then, our backup file has been growing a meg or two a day, so the compressed files keep getting bigger. We keep hourly backups for a week, so you can imagine the space usage is quite high.
I ran a few tests to see whether it would be beneficial for us to use bzip2 instead of gzip. I could have tried 7z as well, but it isn't installed on most of the Linux machines I work on, and I didn't want a solution that requires adding more software. Here are the results:
Original SQL file size: 584M
gzip: 166M, compressed in 1 minute 25 seconds
bzip2: 125M, compressed in 1 minute 50 seconds
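If you want to run the same comparison on your own dump, it's just a matter of timing both tools against the same file. A minimal sketch (the sample file here is a stand-in for your real dump, and -c keeps the original around so both tools compress identical input):

```shell
# Stand-in for the real mysqldump output; replace with your actual .sql file.
seq 1 100000 > backup.sql

# -c writes to stdout so the original file is left untouched.
time gzip  -c backup.sql > backup.sql.gz
time bzip2 -c backup.sql > backup.sql.bz2

# Compare the resulting sizes.
ls -lh backup.sql backup.sql.gz backup.sql.bz2
```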
As you can see, bzip2 takes longer to compress, but the file is much smaller, and that difference adds up fast across multiple backups. As our database grows, the size difference will matter even more, since we also FTP the file off the server to onsite/offsite locations every hour.
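One related tweak worth mentioning: mysqldump can be piped straight into bzip2, so the uncompressed dump (584M in our case) never has to hit the disk at all. A rough sketch, with hypothetical connection details and a made-up database name:

```shell
# Hypothetical user/database; the pipe means the raw dump is never written to disk.
mysqldump -u backup_user -p mydb | bzip2 -c > mydb-$(date +%Y%m%d%H).sql.bz2
```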
Does anybody know of any free backup techniques that support incrementals? Comments are always welcome. To learn more about gzip or bzip2, see man gzip or man bzip2 respectively.