Question: I find mysqldump to be very slow for large databases/tables, and I start getting "Linux is dangerously overloaded" type warnings. I also tried to tar the data tables, but since the tables are live, I start getting tar errors as the files change mid-archive.
How do you handle this?
Hello Steve, when it comes to making backups, it's true that mysqldump requires a server with CPU power to spare. I used to run it at 3am and it would be done in an hour, and that was on a dual Opteron machine with a RAID 0 array. Within a few months the process grew to more than 2 hours, and I eventually had to stop dumping the forum posts table and the private messages table.
At that point, I had slave database servers to offload the database and act as a backup against hardware failure, but everything was still prone to human error. If I accidentally deleted a table, there wouldn't be a backup — replication would happily delete it on the slaves too.
From that point on, truly backing up our InnoDB database has meant shutting down MySQL on the slave server and copying the database files to another hard drive. It's quite a pain to manage, and at this point we know we'll soon need some automated software, which is going to be costly.
Personally, I think the most practical way to back up is to install a big hard drive on your database server and straight up copy your database files onto it. From there, it's just a question of whether you have the CPU power or not: run mysqldump if you have the headroom to handle it; otherwise, shut down the board while you run mysqldump, or shut down MySQL altogether and copy the database files over.
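For what it's worth, the "shut down MySQL and copy the files" approach can be sketched roughly like below. The data directory, backup path, and the start command are assumptions — adjust them for your distro. It defaults to a dry run that only prints the plan; set DRY_RUN=0 to actually execute it.

```shell
#!/bin/sh
# Cold-copy backup sketch: stop mysqld, copy the data directory, restart.
# DATADIR, BACKUP_DIR, and the init script path are assumptions.
DATADIR=/var/lib/mysql
BACKUP_DIR=/backup/mysql-$(date +%Y%m%d)

# Dry-run wrapper: prints each command unless DRY_RUN=0.
run() {
    if [ "${DRY_RUN:-1}" = "1" ]; then
        echo "would run: $*"
    else
        "$@"
    fi
}

run mysqladmin shutdown             # stop mysqld so the files on disk are consistent
run cp -a "$DATADIR" "$BACKUP_DIR"  # plain file copy of the whole data directory
run /etc/init.d/mysql start         # bring the server back up (path varies by distro)
```

The shutdown step is what makes this safe for InnoDB — copying the files while mysqld is running gives you the same inconsistency problem as tar on live tables.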
* When using mysqldump, be sure to use the "--opt" option.
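For example, a nightly dump with --opt could be scheduled with a crontab entry like this (the database name and paths are made up for illustration; the login credentials would come from that user's ~/.my.cnf so the password isn't on the command line). Note that on MySQL 4.1 and later, --opt is enabled by default anyway:

```
# hypothetical crontab line: dump forum_db at 3am with --opt
0 3 * * * mysqldump --opt forum_db > /backup/forum_db.sql
```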
I don't have many other practical solutions at hand; I'll edit this post if I think of anything.