rm argument list too long find and xargs

Christopher Browne cbbrowne-Re5JQEeQqe8AvxtiuMwx3w at public.gmane.org
Mon Apr 12 17:58:52 UTC 2010


On Mon, Apr 12, 2010 at 12:29 PM, teddymills <teddy-5sHjOODPK7E at public.gmane.org> wrote:
>
> I was trying to rm more than 1024 files at one time (old logs)
> and rm returned a strange error. "Argument list too long"
>
> so find and xargs to the rescue..
> find . -name 'myDEBUG*.log' | xargs rm
>
> http://en.wikipedia.org/wiki/Xargs
> uname -a shows a kernel less than 2.6.23

This isn't normally about the kernel version - it's what happens when
your shell's "globbing" / argument expansion produces an argument list
larger than a single command invocation can accept.

The system restricts the combined size of a command's arguments (and
environment) to something "not arbitrarily large."  The limit is
measured in bytes rather than in number of arguments, so long
filenames consume more of it than short ones.
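You can query that byte budget directly; getconf ARG_MAX is the
standard way to do it:

```shell
# Report the maximum combined size, in bytes, of the arguments and
# environment that a single exec'd command may receive.
getconf ARG_MAX
```

If the expanded glob exceeds this, the exec fails and the shell
reports "Argument list too long."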

In any case, the usual solutions involve running the command in some
way that doesn't require the entire file list to be passed as
arguments to a single invocation.

Usually this involves:

1.  Finding filenames using find, which writes the list of files to
stdout, a stream that is not restricted in size.

2.  Then invoking "rm" against each file, in some fashion.

There are generally two methods:

a)  find . -name "pattern" | xargs rm

xargs grabs portions of the stream and repeatedly invokes rm against
those portions, packing as many filenames into each invocation as the
argument-size limit allows.  This means rm is only spawned a few
times, each run against a list of many files.
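One caveat worth adding: plain xargs splits its input on whitespace,
so filenames containing spaces break the pipeline.  GNU and BSD
find/xargs support NUL-delimited lists to handle that.  A sketch,
using a hypothetical scratch directory for the demo:

```shell
# Demo setup: hypothetical log files, one with a space in its name.
dir=$(mktemp -d)
touch "$dir/myDEBUG1.log" "$dir/myDEBUG 2.log"
cd "$dir"

# -print0 separates names with NUL bytes instead of newlines, and
# xargs -0 splits on NUL, so whitespace in filenames is safe.
# xargs still batches as many names per rm invocation as fit.
find . -name 'myDEBUG*.log' -print0 | xargs -0 rm
```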

b) You could have find expressly execute rm against each file

find . -name "pattern" -exec rm '{}' \;

This spawns rm many more times, since it runs it once per file.  The
extra overhead might be worthwhile, mind you...
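There is also a middle ground: POSIX find accepts "+" as the -exec
terminator, which batches filenames into as few rm invocations as
possible, much like xargs, without needing a pipe.  A sketch, again
with a hypothetical scratch directory for the demo:

```shell
# Demo setup: hypothetical log files in a scratch directory.
dir=$(mktemp -d)
touch "$dir/myDEBUG1.log" "$dir/myDEBUG2.log"
cd "$dir"

# With '+', find appends as many matched names as fit into each rm
# invocation, instead of spawning one rm per file as '\;' does.
find . -name 'myDEBUG*.log' -exec rm '{}' +
```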
-- 
http://linuxfinances.info/info/linuxdistributions.html
--
The Toronto Linux Users Group.      Meetings: http://gtalug.org/
TLUG requests: Linux topics, No HTML, wrap text below 80 columns
How to UNSUBSCRIBE: http://gtalug.org/wiki/Mailing_lists