Steve Fosdick wrote:
Backing up the files that comprise a database is generally OK if the database is not up and running at the time. By that I mean all the database server processes have been shut down, not just that there are no active transactions.
My fault for not being clear on that one; I intend to use mysqldump (or mysqlhotcopy) to back up the databases. The part I'm stuck on is working out which databases have changed first. Some of the databases belong to CMS-type packages that only change when their content does, so they need backing up, but dumping them every day wastes space for no benefit.
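For the dump itself, something along these lines is what I have in mind (the /backup path and the database name are just examples):

    # Dump a single database to a file; /backup and mydb are placeholders.
    mysqldump --opt mydb > /backup/mydb.sql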
You can run find with -type f and a time test to list files that have changed, and then work out which directory each one is in so as to back up that whole directory. I don't know how deep MySQL directories go, so you may need to work up from a changed file to its top-level directory.
They only go one level deep (i.e. the db files are in a directory named after the database).
You could simply loop through the database directories:
    for db in /var/lib/mysql/*; do
        if [ -n "$(find "$db" -type f -mtime -1)" ]; then
            # Backup $db or add to list to be backed up.
            :
        fi
    done
That should be what I need, thanks.
You may also want to change '-mtime -1' to '-newer stamp-file', where the stamp file is created or touched at the start of each backup; instead of testing whether the database has changed in the last 24 hours, you are then testing whether it has changed since the last backup.
Good idea. I had thought that would be desirable but hadn't got to the point of working out how to do it.
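Presumably something like this minimal sketch, combining it with the loop above (the stamp-file locations are made up, and GNU touch is assumed for the -d option):

    #!/bin/sh
    # Sketch: back up only databases changed since the last run.
    stamp=/var/lib/mysql/.last-backup
    # On the very first run, pretend the last backup was in 1970
    # so that every database gets backed up once.
    [ -f "$stamp" ] || touch -d '1970-01-01' "$stamp"
    # Mark the start of this run before scanning, so changes made
    # while the backup runs are picked up next time.
    touch /var/lib/mysql/.this-backup
    for db in /var/lib/mysql/*; do
        [ -d "$db" ] || continue   # skip stray files like mysql.sock
        if [ -n "$(find "$db" -type f -newer "$stamp")" ]; then
            echo "backing up $db"  # replace with the real mysqldump call
        fi
    done
    # Promote the new stamp only once the run has completed.
    mv /var/lib/mysql/.this-backup "$stamp"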