billions of files, ext3, reiser, and ls -aly
Chris F.A. Johnson
cfaj-uVmiyxGBW52XDw4h08c5KA at public.gmane.org
Wed Jan 31 00:33:46 UTC 2007
On Tue, 30 Jan 2007, John Macdonald wrote:
> On Tue, Jan 30, 2007 at 06:50:58PM -0500, John Moniz wrote:
> > I just ran into a similar problem trying to delete a bunch of unwanted
> > files (cache for various programs mainly). In one case, there was a huge
> > quantity of files in the directory and a 'rm -f ./*' gave me an error of
> > too many arguments. I had to do it in chunks such as 'rm -f ./*.png'
> > etc, grouping as many files together as possible. Should I have done
> > this with a different command?
>
> Using:
>
> find . -print | xargs rm
In case there are problems with filenames (spaces, newlines, etc.), use:
find . -print0 | xargs -0 rm -f
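If your find is GNU find (an assumption; -delete is not required by
POSIX), you can also skip xargs entirely:

find . -type f -delete

find removes each file as it encounters it, so there is no argument-list
limit to run into.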
> would do it in one shot (as long as you really want to
> delete everything in the current directory and there are
> no sub-directories - there are other variants to deal with
> those cases).
If there are subdirectories whose contents you don't want to
delete:
find . -maxdepth 1 -type f -print0 | xargs -0 rm -f
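A quick illustration in a throwaway directory (the names here are made
up for the example):

mkdir -p /tmp/rmdemo/keepdir
touch /tmp/rmdemo/a.png '/tmp/rmdemo/b c.png' /tmp/rmdemo/keepdir/keep.txt
cd /tmp/rmdemo
find . -maxdepth 1 -type f -print0 | xargs -0 rm -f
ls -A     # keepdir and keepdir/keep.txt are still there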
--
Chris F.A. Johnson <http://cfaj.freeshell.org>
===================================================================
Author:
Shell Scripting Recipes: A Problem-Solution Approach (2005, Apress)