billions of files, ext3, reiser, and ls -aly
Chris F.A. Johnson
cfaj-uVmiyxGBW52XDw4h08c5KA at public.gmane.org
Wed Jan 31 00:31:46 UTC 2007
On Tue, 30 Jan 2007, John Moniz wrote:
> Lennart Sorensen wrote:
>
>> On Tue, Jan 30, 2007 at 03:20:19AM -0500, Chris F.A. Johnson wrote:
>>
>> > If the command is 'ls -al' you will not get a 'too many arguments'
>> > error, because there are not too many arguments; there are none
>> > besides the options. If you use 'ls -al *' you may, and it depends
>> > on the system (glibc may be part of it) and how many arguments (and
>> > possibly the maximum length of the arguments).
>>
>> The shell will usually have a command-line limit. I think in older
>> versions of bash it was 32768 characters, but I think it is more like
>> 128000 now. Not sure. I almost never exceed it, and when I do I know
>> how to use find and xargs. Almost certainly the limit depends on the
>> libc, the version of the shell, and various other factors.
>
> I just ran into a similar problem trying to delete a bunch of unwanted files
> (cache for various programs mainly). In one case, there was a huge quantity
> of files in the directory and a 'rm -f ./*' gave me an error of too many
> arguments. I had to do it in chunks such as 'rm -f ./*.png' etc., grouping as
> many files together as possible. Should I have done this with a different
> command?
There are various ways to do it. If the files are sensibly named
(no spaces or other pathological characters), it is relatively
easy, e.g.:
xargs rm < <(ls .)  ## bash process substitution; ls emits one name per line
printf "%s\n" * | xargs rm ## printf is a bash builtin
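
If the names are not so well behaved (spaces, newlines), a
NUL-delimited pipeline avoids the word-splitting problem. A rough
sketch, assuming GNU find and xargs (-maxdepth, -print0, -0 and
-delete are GNU extensions):

find . -maxdepth 1 -type f -print0 | xargs -0 rm -f --
find . -maxdepth 1 -type f -delete   ## same thing, recent GNU find only

xargs batches the names into command lines that fit under the system
limit, so 'too many arguments' cannot occur.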
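
Since the glob is expanded inside the shell itself, and never handed
to an external command all at once, a plain loop also sidesteps the
limit, at the cost of running one rm per file:

for f in ./*; do rm -f -- "$f"; done   ## slow but immune to the limit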
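
As an aside, the 'too many arguments' error comes from the kernel's
execve() limit (ARG_MAX), not from rm or the shell itself. Assuming a
POSIX getconf and GNU xargs, you can check the actual numbers on your
system:

getconf ARG_MAX                       ## kernel limit, in bytes
xargs -r --show-limits </dev/null     ## GNU xargs prints the limits it uses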
--
Chris F.A. Johnson <http://cfaj.freeshell.org>
===================================================================
Author:
Shell Scripting Recipes: A Problem-Solution Approach (2005, Apress)
--
The Toronto Linux Users Group. Meetings: http://gtalug.org/
TLUG requests: Linux topics, No HTML, wrap text below 80 columns
How to UNSUBSCRIBE: http://gtalug.org/wiki/Mailing_lists