billions of files, ext3, reiser, and ls -al
John Moniz
john.moniz-rieW9WUcm8FFJ04o6PK0Fg at public.gmane.org
Tue Jan 30 23:50:58 UTC 2007
Lennart Sorensen wrote:
>On Tue, Jan 30, 2007 at 03:20:19AM -0500, Chris F.A. Johnson wrote:
>
>> If the command is 'ls -al' you will not get a 'too many arguments'
>> error, because there are not too many arguments; there are none
>> besides the options. If you use 'ls -al *' you may, and it depends
>> on the system (glibc may be part of it) and how many arguments (and
>> possibly the maximum length of the arguments).
>
>The shell will usually have a command line limit. I think in older
>versions of bash it was 32768 characters, but I think it is more like
>128000 now. Not sure. I almost never exceed it, and when I do, I know
>how to use find and xargs. Almost certainly the limit depends on the
>libc, the version of the shell, and various other factors.
>
>--
>Len Sorensen
>
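For what it's worth, the "too many arguments" error seems to come from
the kernel rather than from bash itself: execve() fails with E2BIG when
the combined size of the arguments and the environment exceeds ARG_MAX.
Assuming a GNU userland, two quick ways to check the limit on a given
system:

    # Maximum bytes of arguments + environment a single exec() accepts
    getconf ARG_MAX

    # GNU xargs can report the limits it has worked out for itself
    xargs --show-limits < /dev/null
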
I just ran into a similar problem while trying to delete a bunch of
unwanted files (mainly cache files from various programs). In one case
there were so many files in the directory that 'rm -f ./*' gave me a
'too many arguments' error. I had to do it in chunks, such as
'rm -f ./*.png' and so on, grouping as many files together as possible.
Should I have done this with a different command?
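
From what I've read since, find is the usual answer here, since it can
delete the files itself (or batch their names through xargs) without
the shell ever expanding them into one giant argument list. Something
like the following is what I have in mind, assuming GNU find and xargs:

    # Remove every regular file in the top-level directory without
    # building one huge command line; -maxdepth 1 roughly matches
    # what './*' would have covered
    find . -maxdepth 1 -type f -delete

    # Or batch the names through rm via xargs; -print0 and -0 keep
    # filenames containing spaces or newlines intact
    find . -maxdepth 1 -type f -print0 | xargs -0 rm -f
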
John.