Utility for finding duplicate files?

William Park opengeometry-FFYn/CNdgSA at public.gmane.org
Mon Jun 21 03:33:36 UTC 2010


On Sun, Jun 20, 2010 at 06:31:30PM -0400, Walter Dnes wrote:
>   Last week, after my main machine's hard drive started making ominous
> noises, I copied over just about all data from the machine.  There's a
> ton of duplication with the major backups on my backup USB drive.  I
> could do something like...
> 
> #!/bin/bash
> # Quote expansions and silence diff, so filenames with spaces
> # (or files missing from ../dir2) don't break the loop.
> for file1 in *
> do
> if diff -q -- "${file1}" "../dir2/${file1}" >/dev/null 2>&1; then
>   echo "rm \"../dir2/${file1}\"" >> removelist
> fi
> done
> 
> ...and then source removelist
> 
>   Is there a utility program already written that can generate a list of
> duplicate files?

1. find . -type f | xargs ...
   find . -type f -exec ...   (sketches of all three options below)

2. rsync -n ... 

3. diff -q . ../dir2
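
For 1, a rough sketch (untested; assumes GNU md5sum/sort/uniq, and
filenames without embedded newlines).  It checksums every file in
both trees, then prints the lines whose first 32 characters (the MD5
hash) repeat, i.e. the content duplicates:

    # hash both trees, sort by hash, keep only repeated hashes
    find . ../dir2 -type f -exec md5sum {} + | sort | uniq -w 32 -D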
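
For 2, a dry run can report which files differ between the two
trees.  Here -c forces checksum comparison and --existing limits the
report to files present on both sides, so any file that is *not*
listed is an identical duplicate (again a sketch, not tested on your
layout):

    # dry run: list files that exist in both trees but differ
    rsync -n -rvc --existing ./ ../dir2/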
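
For 3, GNU diff's -s (--report-identical-files) option names the
identical files explicitly, which is easy to grep into a removal
list (fragile if filenames contain spaces):

    # one "Files ./X and ../dir2/X are identical" line per duplicate
    diff -qs . ../dir2 | grep ' are identical$'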

-- 
William
