Backup Your MySQL Databases Daily

Most professionally built websites these days use a database as a backend. Often the database stores user-submitted content or other data that changes or grows daily. You may have copies of your PHP scripts, images, and the other files that make up your site on your local hard drive or elsewhere, but unless you back up your MySQL databases as well, you'll lose all of that database content in the event of a catastrophic failure.

The simplest way to do database backups is to create MySQL dumps, either from the server shell or from a frontend such as phpMyAdmin, and save them to your local computer or a remote server. That works, but you have to remember to do it, ideally daily, and that quickly becomes a chore. Automated backups are a far better option.
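
For example, a one-off dump from the shell looks something like this (the user and database names here are just placeholders), after which you would download the compressed file somewhere safe:

mysqldump -u youruser -p yourdatabase | gzip > yourdatabase-backup.sql.gz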

The goal of this article is to show you how to automate backups of your MySQL databases.

Daily Backups

Any database that holds user-submitted content, content that can change every day, should be backed up daily. So your forum, blog, and other such databases really need daily backups. E-commerce databases should also be backed up daily so you do not lose order or customer data.

To do daily backups I use a shell script executed by a cron job. I'd like to be able to tell you where I found this script, but all I remember is picking it up on a forum many years ago, and I have used it for so long that I have forgotten whom to credit. So, if you created this script, I'm sorry for not giving you credit here.

Create a file called backup.sh and put the following within it:


#!/bin/bash
# Set a value that we can use for a datestamp
DATE=`date +%Y-%m-%d`
# Our Base backup directory
BASEBACKUP="/backup/daily"

for DATABASE in `cat /backup/db-list.txt`
do
        # This is where we throw our backups.
        FILEDIR="$BASEBACKUP/$DATABASE"

        # Test to see if our backup directory exists.
        # If not, create it.
        if [ ! -d $FILEDIR ]
        then
                mkdir -p $FILEDIR
        fi

        echo -n "Exporting database:  $DATABASE"
        mysqldump --user=root --opt $DATABASE | gzip -c -9 > $FILEDIR/$DATABASE-$DATE.sql.gz
        echo "      ......[ Done Exporting to local backup, now exporting for remote backup] "
        cp $FILEDIR/$DATABASE-$DATE.sql.gz /backup/uploads/$DATABASE.sql.gz
        echo "      .......[Done]"
done

# AutoPrune our backups.  This will find all files
# that are "MaxFileAge" days old and delete them.
MaxFileAge=4
find $BASEBACKUP -name '*.gz' -type f -mtime +$MaxFileAge -exec rm -f {} \;

Now, I run dedicated servers, so I have full access to the directory tree and am not restricted in the paths I can use. If you are on a shared server, the paths used in the script (/backup/daily, /backup/db-list.txt, and /backup/uploads) will need editing to point somewhere inside your own account.
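
One other thing to note: the script calls mysqldump as root without a password, which only works unattended if MySQL is already set up to allow that. If yours is not, a common approach (not part of the original script) is to put the credentials in a ~/.my.cnf file in the home directory of the user the cron job runs as, which mysqldump reads automatically:

[client]
user=root
password=yourpassword

Make sure that file is readable only by its owner (chmod 600 ~/.my.cnf).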

Basically, the script pulls the list of databases to be backed up from a file called db-list.txt, which takes one database name per line. It then exports each database, compresses the dump, and saves it with a datestamped filename in a directory called /backup/daily/databasename/. Then the script copies a non-datestamped version of the file to a directory called /backup/uploads/, overwriting the previous file stored there.
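
A db-list.txt for a site running, say, a forum, a blog, and a small shop might look like this (the database names are just examples):

forum_db
blog_db
shop_db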

After the loop, the script goes through the daily directory structure and deletes any backup files older than 4 days (that is what the MaxFileAge setting controls).

So, I end up with a directory for each database I back up daily, and within each directory are gzipped SQL dumps from each of the previous 4 days. I also have a directory called uploads, which holds one copy of each database dump, always the most recent.
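
As for actually running it every day: save the script, make it executable (chmod +x backup.sh), and call it from cron. Assuming the script lives at /backup/backup.sh (adjust the path to wherever you put yours), a crontab line along these lines runs it every night, here at 2:30am as an example:

# run the daily MySQL backup script every night at 2:30am
30 2 * * * /backup/backup.sh > /dev/null 2>&1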