Sorry, I replied instead of replying-all. I don't do much email, OK?

---------- Forwarded message ----------
On Wed, Feb 4, 2009 at 1:26 PM, Srdjan Todorovic <todorovic.s@googlemail.com> wrote:
Hi

2009/2/4 Chris G <cl@isbd.net>:
> I still don't think it addresses the problem, unless it's an
> incremental backup.  If someone breaks into 'my' machine (the machine
> being backed up) then they can send rubbish data to overwrite the good
> backups can't they?  This is the specific possibility I'm trying to
> protect myself against a bit.

This is the old replication argument: replication gives you hot copies to survive hardware failure, not backups that protect against bad data (though it has other benefits too).
 

Now that I remember, there was an article in one of the Linux Format
issues where you could use Subversion to hold incremental, versioned,
logged home directories.

Anyone tried it in general? Anyone tried it specifically for this kind
of backup setup?
(or anyone tried other version control software (git?) for this?)

Me, I used a combination of SVN and TortoiseSVN on Windows to do "backups" of all my files for quite a long time. The system works well for a smallish number of files - my entire OS has something like 50k files in it, and I wouldn't want to commit the lot to SVN ;)

The idea works nicely because even if I got a virus that destroyed some files, and the damaged files then got automatically committed, I could still just roll back to a particular date.

I'm not sure about the efficiency of this backup system though. If you have a program that makes large changes to binary working files, the repository will grow quickly and you would probably run into trouble.


Interesting thought. One problem would be that it's not automated (you
would have to write a commit log message)...

Not too difficult - I had a few shell scripts running from cron and the Windows task scheduler.
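On the cron side it's just one line; svn is happy with a canned log message, so nothing is interactive. A sketch (paths and script name are made up for illustration):

```shell
# Hypothetical crontab entry: at 02:30 every night, run a script that
# does "svn add --force" and "svn commit -m ..." in the backup working copy.
30 2 * * * /home/user/bin/backup-commit.sh >> /home/user/backup.log 2>&1
```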
 

You can do svn+ssh:// so there might be a possibility of using SSH
keys if you really wanted.

Which helps with the automated scripting.
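Right - with a passphrase-less key the cron job can commit without ever prompting for a password. A rough sketch, where the hostname `backuphost` and the repository path are assumptions for illustration:

```shell
# Generate a passphrase-less key pair for the cron job (paths are examples).
ssh-keygen -t rsa -f ~/.ssh/backup_key -N ""

# Install the public key on the server (backuphost is an assumed hostname).
ssh-copy-id -i ~/.ssh/backup_key.pub user@backuphost

# Check out over svn+ssh; commits from cron then need no password.
svn checkout svn+ssh://user@backuphost/var/svn/backup ~/backup-wc
```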