On 29 Aug 09:55, Sagr wrote:
As ALUGers may be aware, a friend of mine recently had his website defaced by hackers. This has made me realise how seldom I actually check my own website to see if it is still as it should be. Bearing in mind that my own website is quite static, only being updated once a month or so, checking it would seem an ideal task for automation.
I would guess I am not the only person on the planet who doesn't have time to check their website regularly for defacement, so I was wondering whether any experienced ALUGers could recommend a ready-made script I could run on my home server to compare my live website (on a remote commercial web host) with a stored image of it (on my local server) a couple of times a day.
Do other ALUGers bother to check their own websites for defacement regularly? If so, what tools do you use?
I'd just cron a script like the following:
--- Begin Script ---
#!/bin/bash

site=http://www.sommitrealweird.co.uk/
md5sumfile=/home/brettp/.sommitrealweird.md5

# First run: just store the current checksum and exit.
if [ ! -e "$md5sumfile" ]; then
    dog --no-header "$site" | md5sum > "$md5sumfile"
    exit 0
fi

# Fetch the page once and checksum it.
current=$(dog --no-header "$site" | md5sum)

# Print the stored and current sums for reference.
echo "$(< "$md5sumfile")"
echo "$current"

if [ "x$current" != "x$(< "$md5sumfile")" ]; then
    echo "Page changed"
    exit 1
fi
--- End Script ---
That was just knocked up as an example, but it does work ;)
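For the "couple of times a day" part, a crontab entry along these lines would do (the script path here is just an example, point it at wherever you save it; cron will mail you the "Page changed" output as long as your box can deliver local mail):

# check the site at 08:00 and 20:00 every day
0 8,20 * * * /home/brettp/bin/checksite.sh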
dog is cat plus HTTP, FTP and various other things; it's really quite handy to have knocking about. apt-get install dog!
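If dog isn't packaged for your distro, curl or wget will fetch the page just the same; only the pipeline line changes and the rest of the script stays as it is:

curl -s "$site" | md5sum
# or
wget -q -O - "$site" | md5sum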
Cheers,