I use mysqldump to extract the data from my database, driven by a shell script scheduled in cron. I keep 7 generations of backups (in theory; I count 8 when I look right now), taken each Saturday just after midnight.
I have 3000+ users, 61K+ posts, and my gzipped backup file is 25,073,125 bytes. When I remember to do so, I download the files to my own workstation. I'm in the process of automating that part as well.
The entire backup process takes less than a minute to run. Here's part of my backup script; since the "keep 7 only" rotation doesn't seem to be working at the moment, I won't post that part...
Code:
# disable board
# set path and db name variables
# do the backup (don't redirect stderr into the dump file --
# let errors reach cron's mail instead)
/usr/local/mysql/bin/mysqldump -h localhost -u USERNAME --password=PASSWORD $dbname > $dbpath/$dbname/dump.sql
# create new backup filename
# zip, and copy the output
gzip -f $output_file
mv $output_file.gz $filename
chmod 644 $dbpath/$dbname/*
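Since the "keep 7 only" part isn't posted, here's a minimal sketch of one way to do that rotation. The directory and the backup-*.gz naming are my assumptions for the sketch, not the actual script:

```shell
# hypothetical rotation step: keep the 7 newest backup-*.gz files,
# delete the rest (directory and file names are made up)
backup_dir=/tmp/demo_backups
mkdir -p "$backup_dir"
# ls -1t sorts newest first; tail -n +8 emits the 8th-newest and older
ls -1t "$backup_dir"/backup-*.gz 2>/dev/null | tail -n +8 | xargs -r rm -f
```

The -r on xargs skips the rm entirely when there are 7 or fewer files, so the step is safe to run even right after the first backup.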
# enable board
The purpose of the call to "date" at the beginning and the end is to track how long the process takes. The perl script board_status.pl takes the board offline during the backup, and brings it back online after the backup is done.
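The begin/end date trick can be sketched like this; the sleep just stands in for the real dump-and-zip work:

```shell
# time the backup the same way: capture epoch seconds before and after
start=$(date +%s)
sleep 1          # stand-in for the mysqldump/gzip steps
end=$(date +%s)
echo "backup took $((end - start)) seconds"
```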
The reason for putting the db name and path in variables is that eventually I will use this script on several boards, so I want one backup script for all of them.
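With the path and db name in variables, extending to several boards is just a loop around the same steps. The board names and backup path here are hypothetical, not from the actual script:

```shell
# hypothetical driver loop: run the same backup steps once per board
dbpath=/tmp/board_backups
for dbname in board_one board_two; do
  mkdir -p "$dbpath/$dbname"
  # ... disable board, mysqldump, gzip, re-enable, as in the script above ...
  echo "backed up $dbname into $dbpath/$dbname"
done
```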
The most important step in any backup process is to TEST YOUR BACKUP!!!
I used to work for a software company that sold a backup product, and you can't imagine how many times people assumed everything was fine because the backup was running. But they never tried to restore.
So at your first chance (and on some regular basis after that), test your backup. Make a new forum and restore your data into it. Test it out. If it works, do it again about once a quarter, or after major mods, upgrades, or whatever. But test. 8)
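To make the restore test concrete: restoring is just piping the unzipped dump back into mysql against a scratch database. The database name test_restore is made up for this sketch; the snippet below builds a tiny fake dump so the gunzip half can be seen round-tripping:

```shell
# the actual restore would look something like (test_restore is hypothetical):
#   gunzip -c $filename | mysql -h localhost -u USERNAME --password=PASSWORD test_restore
# here we just build a one-line fake dump and confirm it unzips intact
filename=/tmp/fake_dump.sql.gz
echo "CREATE TABLE topics (topic_id INT);" | gzip > "$filename"
gunzip -c "$filename"
```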