Linus on Gnome 3.2

D. Hugh Redelmeier hugh-pmF8o41NoarQT0dZR+AlfA at public.gmane.org
Sun Dec 4 05:09:35 UTC 2011


| From: Howard Gibson <hgibson-MwcKTmeKVNQ at public.gmane.org>

|    have an old computer that, probably, will not run the latest version 
|    of Microsoft Windows.  They are curious about Linux.  The old Gnome 
|    was a good solution for these people, and it provided enough eye 
|    candy to keep power users happy.  I am not sure what to tell people 
|    now.  Definitely, they should install KDE, if for nothing else to get 
|    KDM, which can be configured to not display the user list.  XFCE is a 
|    good low powered window manager, but I do not know how well the Linux 
|    community supports it.  It uses a lot of old, now-obsolete Gnome 
|    tools.
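
(Aside: the user-list knob mentioned above lives in kdmrc.  From
memory -- so treat the section and key names as approximate -- it
looks something like this:

    [X-*-Greeter]
    # don't offer a clickable list of accounts at the login screen
    ShowUsers=None

The file itself usually sits somewhere under /etc, e.g.
/etc/kde4/kdm/kdmrc on recent packagings.)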

I would really like to know where the fat is.  Apparently not enough
to cause me to do systematic research.  There's a bit in
<http://www.linuxsymposium.org/archives/OLS/Reprints-2008/redelmeier-reprint.pdf>

Anecdotes:

I have a reasonable older AMD64 Small Form Factor PC in my pile of
spares.  I decided to drop Ubuntu 11.10 on it to loan it to someone.
It was horrible -- disk bound for a lot of stuff.  I fixed that by
upping the RAM from 512MiB to 1024MiB (rendering my other spare
RAMless and useless).  That's ridiculous.
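
Aside: the quick way to confirm that a box is thrashing like that is
to watch the swap columns in vmstat:

    $ vmstat 1
    # sustained non-zero numbers under si/so (swap-in/swap-out)
    # mean the machine is paging, i.e. genuinely short of RAM;
    # "free -m" shows how much RAM is left over for the page cache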

Note: RAM is cheap.  I bought 8GiB for $30 today.  But DDR3 won't work
with boxes that require DDR1.


The first box I bought to run Linux had 64MiB of RAM.  That was
serious overkill.


When I decided to run X on my Sun 3 system (instead of SunView or
whatever it was called), I thought it was prudent (not required) to
upgrade to 8MiB of RAM.  Even then I thought X to be excessively large
and I said (immortalized in Henry's signature):
  The average pointer, statistically, points somewhere in X.
<http://yarchive.net/explosives/xenon_compounds.html>


The first widely used GUI system was the Mac.  It came with 128KiB
(not MiB or GiB) of RAM, a significant chunk of which was frame
buffer: the 512x342 one-bit display needs about 21KiB, a sixth of the
machine.  We all thought the RAM was too light; 512KiB was good.


The Xerox Alto only had 128KiB of memory, if I remember correctly.


I know that newer systems do more and that there is no reason to be
seriously stingy when memory is this cheap, but what are we burning
it on?  In my history with X-based systems, the memory requirements
have gone up by a factor of about 1000 -- from the 8MiB on that Sun 3
to the 8GiB I bought today.  Truly mind boggling.

One partial answer is caching.  That is completely justifiable given
how slowly disk speed is scaling compared with CPU speed.  The other
side of that coin is that caching lets programmers get sloppy about
how much file access they do, because the cost is masked on the
machines they develop on.
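
You can measure what the cache is worth by timing the same read cold
and warm; on Linux, drop_caches discards the page cache:

    $ sync; echo 3 | sudo tee /proc/sys/vm/drop_caches  # flush caches
    $ time wc -l /usr/share/dict/words                  # cold: from disk
    $ time wc -l /usr/share/dict/words                  # warm: from RAM

The warm run is usually faster by an order of magnitude or more --
which is exactly the cost that never shows up on a developer's
machine.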

Example: when I first used UNIX, we kept our PATH lists short to cut
down on the time the shell spent searching for a command.  Nobody
does that any more.  When I type a bum command into bash these days,
there is a noticeable delay as (I think) it consults the package
manager to see if it can guess what I need to install to make my
typo work.
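
That hook is bash's command_not_found_handle.  Ubuntu wires it up in
/etc/bash.bashrc to a helper that searches the package index; from
memory -- so the details are approximate -- it looks roughly like:

    command_not_found_handle () {
        # hand the unknown name to a helper that greps the
        # package index for something that would provide it
        if [ -x /usr/lib/command-not-found ]; then
            /usr/lib/command-not-found -- "$1"
            return $?
        fi
        printf '%s: command not found\n' "$1" >&2
        return 127
    }

The delay is that helper grovelling through the package lists, not
the PATH search itself.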

There's a parallel example.  People tell me that systems implemented
in Java are big and bloated and slow.  But I can never get out of them
why.  My best guess is that the systems are so intricate that nobody
understands them well enough to keep them simple.  But that
explanation is based on superstition, not science.
--
The Toronto Linux Users Group.      Meetings: http://gtalug.org/
TLUG requests: Linux topics, No HTML, wrap text below 80 columns
How to UNSUBSCRIBE: http://gtalug.org/wiki/Mailing_lists




