Sent from my iPhone, sorry about top posting.
Did the find approach work well for finding duplicate files? I have a lot of picture files that I need to sort through so I can delete the duplicates.
Regards Ian Porter
www : www.codingfriends.com
On 18 Jun 2010, at 12:49, (Ted Harding) Ted.Harding@manchester.ac.uk wrote:
On 18-Jun-10 11:05:45, Brett Parker wrote:
On 18 Jun 11:31, Ted Harding wrote:
On 18-Jun-10 09:42:47, Mark Rogers wrote:
On 18/06/10 10:36, MJ Ray wrote:
I'd be looking at scripting something with find, its -exec option, md5sum, sort, mv and maybe ln if I want to keep the filenames, but I've not looked at fslint.
The more I think about it the more I am thinking that I should go down this route. It will be a one-off process, so if it takes overnight to run, it doesn't really matter, and that way I can control what it does.
However, if a standard tool (like fslint) *can* do it, then I think it's the kind of tool I should learn to use. After all, I could write a script to find the files without using "find", but it's a good tool for that job and I'm glad I have learnt to use it.
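A minimal sketch of the find/md5sum/sort approach described above, assuming GNU findutils and coreutils (md5sum's 32-character hash lets uniq group on it with -w32):

```shell
# Hash every regular file, sort so identical checksums are adjacent,
# then keep only the duplicated groups, blank-line separated.
find . -type f -exec md5sum {} + |
  sort |
  uniq -w32 --all-repeated=separate
```

Each group in the output is a set of byte-identical files; which copy to keep (or hard-link with ln) is then a manual decision.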
--
Possibly useful in locating the duplicates (indeed using 'find') may be:
for i in $(find . -type f); do ls -lgG --block-size=1 "$i" | awk -v F="$i" '{print $3 " " F}'; done | sort -n
Why not:
find . -type f -printf "%10s %AY-%Am-%Ad %AH:%AM %p\n" | sort -n
Fewer pipes, less to go wrong, no need for the awk...
Ta,
Brett Parker
That's neat! However, it seems to depend on whatever "%p" comes out as, and I can't find any documentation about it! Where should I look?
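(For reference: the format directives, including %p, are documented in the GNU find manpage under the -printf action; %p expands to the file's pathname as find encountered it, and %s to its size in bytes. A quick way to see it in action:)

```shell
# List regular files as "size pathname", smallest first.
# %10s right-pads the size to 10 columns so sort -n lines up cleanly.
find . -type f -printf "%10s %p\n" | sort -n
```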
Ted.
E-Mail: (Ted Harding) Ted.Harding@manchester.ac.uk
Fax-to-email: +44 (0)870 094 0861
Date: 18-Jun-10  Time: 12:49:13
------------------------------ XFMail ------------------------------