Slowing Linux to a crawl

Chris F.A. Johnson cfaj-uVmiyxGBW52XDw4h08c5KA at public.gmane.org
Fri Jul 31 03:48:07 UTC 2009


On Thu, 30 Jul 2009, D. Hugh Redelmeier wrote:

> | From: Giles Orr <gilesorr-Re5JQEeQqe8AvxtiuMwx3w at public.gmane.org>
> 
> | echo $(($(find -maxdepth 1 -mindepth 1 -type f -printf '%k+' ; echo 0)))KiB
> 
> I was wondering what would happen if the size got to be "too big".
> The world has a habit of not planning for growth.
> 
> It appears that BASH uses 64-bit evaluation, so there is no
> immediate danger.
> 
> Too bad that BASH silently accepts overflow of these fixed-width
> integers.  A further problem is that the manual makes no promise
> concerning the width of these intermediate results.
> 
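
    For what it's worth, bash evaluates $(( )) in intmax_t, which is
    64 bits on current Linux builds, and it does wrap silently at the
    boundary. A quick check (a sketch; the exact width is whatever the
    build's intmax_t happens to be):

        # Bash arithmetic is fixed-width and wraps silently on
        # overflow; there is no error and no warning:
        echo $((9223372036854775807 + 1))   # prints -9223372036854775808
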
> Using a pipe to bc or dc has a couple of advantages:
> 
> - there surely would not be a size limit, even in the future
> 
> - a pipe is unbounded but a command line probably is not.  This
>   matters if there are a *lot* of files.
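
    For instance, the same find/printf pipeline can feed bc directly
    (a sketch, keeping the GNU find options from the quoted command):

        # find emits "4+8+...+", and the echo supplies the closing 0
        # plus the newline that bc expects; bc sums with arbitrary
        # precision, and the sizes travel through a pipe rather than
        # a command line.
        { find -maxdepth 1 -mindepth 1 -type f -printf '%k+'; echo 0; } | bc

    Wrapping that pipeline in a command substitution restores the KiB
    suffix without shell arithmetic ever touching the total.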

    When using commands internal to the shell, the command line is
    limited only by available memory.
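
    A quick way to see the difference (a sketch; the numbers are only
    illustrative, since the real limit depends on ARG_MAX and the
    stack size of the machine at hand):

        # About 3.4 MB of arguments: past a typical ARG_MAX, but no
        # problem for a builtin, which never goes through execve().
        args=$(printf '%s ' {1..500000})
        echo $args >/dev/null        # builtin echo: succeeds
        /bin/echo $args >/dev/null   # external echo: "Argument list too long"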


-- 
   Chris F.A. Johnson, webmaster         <http://woodbine-gerrard.com>
   ===================================================================
   Author:
   Shell Scripting Recipes: A Problem-Solution Approach (2005, Apress)
--
The Toronto Linux Users Group.      Meetings: http://gtalug.org/
TLUG requests: Linux topics, No HTML, wrap text below 80 columns
How to UNSUBSCRIBE: http://gtalug.org/wiki/Mailing_lists