The rest of the project is set up using cron jobs.
First, you need a cron job that will run your backup.sh script:
10 4 * * * sh /backup/backup.sh
That is mine; it runs at 4:10 every morning.
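For reference, the five leading fields in a crontab line are minute, hour, day of month, month, and day of week, in that order; you can install the line by running crontab -e and adding it:

```
# minute hour day-of-month month day-of-week  command
10 4 * * * sh /backup/backup.sh
```

An asterisk means "every value", so this line fires at minute 10 of hour 4, every day.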
Next, you'll want a cron job that uploads the files in the uploads folder:
10 6 * * * /usr/local/bin/ncftpput -Ef /home/admin/ncftpputlogin / /backup/uploads/*
This line, running at 6:10 every morning, tells a program called ncftpput to follow the instructions in the file called "ncftpputlogin" and upload the files found in /backup/uploads to a different server. The "ncftpputlogin" file contains the server and login information needed to do that. If you do not have ncftp installed on your server, it is easy to install, or you can use another FTP program.
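For reference, the login file passed with -f follows the simple three-line format from the ncftpput documentation; the hostname and credentials below are placeholders:

```
host ftp.example.com
user backupuser
pass yourpassword
```

Because this file holds a plaintext password, keep it readable only by its owner (for example, chmod 600 /home/admin/ncftpputlogin).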
In addition to my daily backups of frequently updated databases, I do monthly backups of all databases on the server. This is much easier to set up; you can do it entirely with cron jobs.
Simply create the following cron jobs:
10 7 1 * * cp /backup/monthly/book2-alldatabases.sql.gz /backup/monthly/book2-alldatabases.sql.gz.old
30 7 1 * * mysqldump --opt --all-databases | gzip -c > /backup/monthly/book2-alldatabases.sql.gz
10 7 2 * * /usr/local/bin/ncftpput -Ef /home/admin/ncftpputlogin / /backup/monthly/book2-alldatabases.sql.gz
So, at 7:10 in the morning on the first day of the month, a cron job copies the previous SQL dump to a .old file.
Then, at 7:30 in the morning on the first day of the month, a cron job creates a new database dump, overwriting the file that was just copied aside.
Then, at 7:10 in the morning the next day, the second day of the month, a cron job uploads that most recent database dump to a remote server.
So, I end up with two full database dumps stored on the local server, and one stored remotely.
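The copy-then-overwrite rotation above can be simulated locally, with a throwaway directory and dummy text standing in for the real MySQL dump:

```shell
#!/bin/sh
# Simulate the monthly rotation with dummy contents instead of mysqldump.
set -e
DIR=$(mktemp -d)

# A dump from the previous month already exists.
echo "old dump" > "$DIR/alldatabases.sql.gz"

# 7:10 on the 1st: copy the previous dump aside as .old.
cp "$DIR/alldatabases.sql.gz" "$DIR/alldatabases.sql.gz.old"

# 7:30 on the 1st: the fresh dump overwrites the original name.
echo "new dump" > "$DIR/alldatabases.sql.gz"

# Two generations now exist side by side.
NEWEST=$(cat "$DIR/alldatabases.sql.gz")
PREVIOUS=$(cat "$DIR/alldatabases.sql.gz.old")
echo "$NEWEST | $PREVIOUS"
rm -r "$DIR"
```

Running it prints "new dump | old dump", confirming that both generations survive until the next month's rotation.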
If your databases get large, these backup processes can take a good deal of time and server resources, possibly even slowing your sites down. Plan to run them at low-traffic times and space them out; in particular, always leave enough time between creating a database dump and uploading it, so your server can finish the SQL export completely. You can also edit these scripts to upload files to multiple locations. If you own more than one dedicated server, you can easily have one FTP its backups to the other. Alternatively, you can sign up for backup data web hosting, which is relatively barebones hosting that simply provides lots of space for file storage. Even companies like Amazon are offering file storage.
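One way to upload to multiple locations is to keep one ncftpput login file per destination and loop over them. This is a hypothetical sketch (the login-file names are made up for illustration); it only prints the commands it would run, so you can check them before going live:

```shell
#!/bin/sh
# Sketch: push the same dump to several remote servers, one -f login
# file per destination. Prints the commands instead of transferring.
DUMP="/backup/monthly/book2-alldatabases.sql.gz"
CMDS=""

for cfg in /home/admin/ftp-site1.cfg /home/admin/ftp-site2.cfg; do
    CMD="/usr/local/bin/ncftpput -Ef $cfg / $DUMP"
    CMDS="$CMDS$CMD
"
    echo "would run: $CMD"
    # Replace the echo with the real ncftpput call once the login files exist.
done
```

Each pass through the loop targets a different server, so losing any single destination still leaves you with a copy elsewhere.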
The important thing is to make sure your files are stored in geographically different locations. In addition to guarding against things like hard drive failure, your backups also have to guard against outside events such as natural or man-made disasters, which could take down entire buildings or cities.