Duplicate file finding script

Christopher Browne cbbrowne-Re5JQEeQqe8AvxtiuMwx3w at public.gmane.org
Tue Sep 20 22:07:03 UTC 2005


On 9/20/05, Jason Shein <jason-xgs8i/e9EeWTtA8H5PvdGCwD8/FfD2ys at public.gmane.org> wrote:
> For those of you whose hard drives are cluttered up with possibly duplicate
> files, try this little script out.
> 
> It will recursively MD5sum all files in a directory and output to a file
> called rem-duplicates.sh. Open this file in an editor and un-comment the
> file(s) you would like removed. Then run rem-duplicates.sh, and all
> un-commented files will be removed.
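
For concreteness, the generator might look something like this sketch
(my own, not Jason's posted script; it assumes GNU md5sum and
well-behaved filenames without embedded quotes or newlines):

    #!/bin/sh
    # Sketch of the approach described above.  Every candidate "rm"
    # is written commented out, so nothing is deleted until you
    # un-comment it and run rem-duplicates.sh yourself.
    OUT=rem-duplicates.sh
    printf '#!/bin/sh\n' > "$OUT"
    find . -type f -print0 | xargs -0 md5sum | sort |
    awk '$1 == prev { printf "#rm \"%s\"\n", substr($0, 35) }
         { prev = $1 }' >> "$OUT"

Sorting on the checksum groups duplicates together, so the awk pass
only has to compare each line's hash with the previous one; the
filename starts at column 35 of md5sum's output.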

An alternative would be to create hard links (assuming the
files are on the same filesystem).

That would mean the files would be treated as though they resided
simultaneously in each of the locations where they are accessible.

This would allow you, for instance, to have an email message that
resides in multiple folders simultaneously, and wouldn't take up any
extra disk space (save for the directory entries, of course!).
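
Concretely, the same pipeline could emit hard-link commands instead
of rm's (again just a sketch, with the same assumptions as above;
review the output before running it):

    # Replace each later duplicate with a hard link to the first
    # copy seen with that checksum.  "ln -f TARGET LINK" forces the
    # duplicate to become another name for the first copy.
    find . -type f -print0 | xargs -0 md5sum | sort |
    awk '$1 == prev { printf "ln -f \"%s\" \"%s\"\n", first, substr($0, 35) }
         $1 != prev { prev = $1; first = substr($0, 35) }'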

-- 
http://www3.sympatico.ca/cbbrowne/linux.html
"The true  measure of a  man is how he treats  someone who can  do him
absolutely no good." -- Samuel Johnson, lexicographer (1709-1784)
--
The Toronto Linux Users Group.      Meetings: http://tlug.ss.org
TLUG requests: Linux topics, No HTML, wrap text below 80 columns
How to UNSUBSCRIBE: http://tlug.ss.org/subscribe.shtml