I nearly had a problem just now: having done a clean install (rather than an upgrade) of xubuntu 11.10, I needed to restore sundry mysql databases. Since they're not in /home or /etc, I thought for a nasty moment that I had no backups, because /var/lib isn't in my automatic cron backups.
However, I had done a one-off copy of the whole of /var before cleaning out the old installation and installing xubuntu 11.10 - phew! :-)
It does make me want to find a sensible way of backing up those mysql databases though. My automatic (cron) incremental backups save /home and /etc. I have made /var/www a symbolic link to a 'home' directory for www-data (the apache UID) at /home/www-data/www, so all my web stuff gets backed up too.
The databases are not heavily used, so there's a pretty good chance that I could get away with making straight copies of all the files in /var/lib/mysql without causing any disasters, but what's the best way to do that? I *could* make /var/lib/mysql a link to somewhere that gets backed up, but that really feels like a bit of a bodge. I'd prefer not to add /var/lib/mysql to my backup list, though I suppose that's really the sanest way to do it.
I use sqlite for as many databases as possible so that this isn't an issue, but some applications insist on mysql (or postgres), and even though my use is very lightweight and doesn't need a proper *database* backup strategy, I need to do something.
Any comments/ideas anyone?
On 04 Jan 16:14, Chris Green wrote:
> It does make me want to find a sensible way of backing up those mysql databases though. [...]
> Any comments/ideas anyone?
The on-disk format of mysql is not guaranteed to work between architectures. If you've not got much data in it, create a cron job that dumps each of the databases into files, back up the dump directory, and job done.
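Something like this would do it (just a sketch - the backup directory, the schedule and the assumption that the mysql credentials come from ~/.my.cnf are illustrative, not anything agreed in this thread):

#!/bin/sh
# Sketch: dump each mysql database to its own file in a directory
# that is already covered by the normal /home backups.
# Assumes credentials are picked up from ~/.my.cnf for the user running it.
backupdir=/home/backup/mysql        # hypothetical destination
mkdir -p "$backupdir"
for db in $(mysql --batch --skip-column-names -e 'SHOW DATABASES')
do
    case "$db" in
        information_schema|performance_schema) continue ;;
    esac
    mysqldump --databases "$db" > "$backupdir/$db.sql"
done

Run it from cron at some quiet time, e.g. a crontab line like "30 2 * * * /usr/local/bin/mysql-dump.sh" (time and script path made up).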
At Wed, 4 Jan 2012 16:14:07 +0000, Chris Green wrote:
> It does make me want to find a sensible way of backing up those mysql databases though. My automatic (cron) incremental backups save /home and /etc. I have made /var/www a symbolic link to a 'home' directory for www-data (the apache UID) at /home/www-data/www so all my web stuff gets backed up too.
> Any comments/ideas anyone?
Prefer mysqldump over copying /var/lib/mysql/foo/*. Add suitable mysqldump calls to your cron job, sending output to somewhere under /home.
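For instance, a single crontab entry along these lines would do (the time, the destination and the reliance on a ~/.my.cnf file for the credentials are just examples):

0 3 * * * mysqldump --all-databases > /home/backup/mysql-all.sql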
Best, Richard
On 2012-01-04 16:14, Chris Green wrote:
> It does make me want to find a sensible way of backing up those mysql databases though. [...]
> Any comments/ideas anyone?
The mysqldump program does a pretty good job of placing all your mysql databases (including user accounts, passwords and all your data) into a file which you can later restore by piping it into the mysql command-line client.
Something like this would do the trick if you aren't worried about encryption or anything like that (I'd suggest you investigate storing the password for this somewhere other than in your script, however):
================================= Begin Script =========================================
#!/bin/bash

mysqluser=root
mysqlpass=P@ssw0rd
mysqldump=`which mysqldump`
outfile=/home/backup/backup.sql
gzip=`which gzip`

if [ -x $mysqldump ]
then
    echo "Performing backup of mysql database as user $mysqluser into file $outfile"
    mysqldump --user=$mysqluser --password=$mysqlpass --all-databases > $outfile
    if [ -e $outfile ]
    then
        if [ -x $gzip ]
        then
            $gzip $outfile
            echo "Backup completed and compressed."
        else
            echo "Could not compress file - plain text version remains at $outfile"
        fi
    else
        echo "Backup appears to have failed - no file found at $outfile"
    fi
else
    echo "Could not find mysqldump"
fi
================================== End Script ==========================================
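To restore later, the dump just gets piped back into the mysql client - something like this, assuming the gzipped file the script above produces:

gunzip < /home/backup/backup.sql.gz | mysql --user=root --password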
Hope this helps,
Jim
On Wed, Jan 04, 2012 at 04:46:30PM +0000, Jim Rippon wrote:
> The mysqldump program does a pretty good job of placing all your mysql databases (including user accounts, passwords and all your data) into a file which you can later restore by piping it into the mysql command-line client.
Sounds a reasonable idea. Reading the mysqldump man page suggests mysqlhotcopy for a faster backup when running on the server (which it will be), but speed isn't very relevant for me; they're all relatively tiny databases.
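(For reference, mysqlhotcopy usage would be something like "mysqlhotcopy dbname /some/backup/dir" - the directory there is just an example - though it only handles MyISAM and ARCHIVE tables.)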
Thanks everyone.
On 04/01/12 16:46, Jim Rippon wrote:
> Something like this would do the trick if you aren't worried about encryption or anything like that (I'd suggest you investigate storing the password for this somewhere other than in your script, however):
I would suggest creating a backup user so that you don't need the root user at all:

grant select, lock tables on *.* to backup@localhost identified by 'backup';

then in the script:

mysqluser=backup
mysqlpass=backup
Depending on the type of data, you may prefer to remove the "lock tables" privilege and add "--skip-lock-tables" to the mysqldump command; personally I prefer not to have my backup script lock my tables and would rather deal with the consequences of any inconsistent data on restore (which would be minimal with the types of data I do this with).
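On the point quoted above about keeping the password out of the script, one approach (a sketch only - the file path is just an example) is to put the backup user's credentials in a separate option file that only the backing-up account can read, and point mysqldump at it:

# /root/.mysql-backup.cnf  (chmod 600)
[client]
user=backup
password=backup

mysqldump --defaults-extra-file=/root/.mysql-backup.cnf --skip-lock-tables --all-databases > /home/backup/backup.sql

(--defaults-extra-file has to be the first option on the command line.)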