The main advantage of a pre-loaded OS...

Lennart Sorensen lsorense-1wCw9BSqJbv44Nm34jS7GywD8/FfD2ys at public.gmane.org
Mon Feb 28 16:47:51 UTC 2011


On Fri, Feb 25, 2011 at 08:16:03PM -0500, D. Hugh Redelmeier wrote:
> I know very little about that world.  Intel's Poulsbo chip for the
> Atom (available for years) does have 3d acceleration (I think).  We
> Linux folks hate it because there is no decent open source driver.
> That's because Intel licensed the PowerVR SGX 535 graphics core and
> didn't get the right to disclose specs of it.
> 
> I understand that there are a number of graphics cores used in the
> non-PC space, that most or all of them are closed, and we rarely hear
> about them.  ATI was in this space but sold off that part of their
> business.
> 
> Only a small percentage of Atoms are sold for systems with IONs (but I
> have some).  I blame Intel.

Well, adding the ION chipset probably increases the production cost by
$20 or so, which, at the price point Atom devices sell at, is a lot.
The power consumption probably goes up a bit too, which reduces battery
life (although it greatly increases the usefulness of the device for
video and graphics).

> In the ION's space, ATI's new Fusion stuff looks interesting.  See
> this netbook, for example (it has been cheaper):
> <http://www.bestbuy.ca/en-CA/product/acer-acer-aspire-10-1-netbook-featuring-amd-processor-c-50-ao522-bz499-black-ao522-bz499/10161870.aspx?path=b0e001c0749c2a838407ee8a0e4a01e9en02>

Yes, Fusion looks interesting, but I still dread having the ATI drivers
involved.  I hope they are getting better at them.

> I've never understood the making money side of this stuff.  nVidia and
> ATI have rarely made much money if I remember correctly.

Well, they spend a lot on research and on competing with each other.  If
one of them went away, high-end graphics would get expensive and/or stop
improving at the current rate.

> Capturing geek mindshare certainly isn't strongly related to making
> money.
> 
> (Just a guess: Apple was probably a customer that PowerPC vendors
> should have fired.  I bet they cost money to the vendors.)

Well, originally turning the POWER architecture into a single-chip CPU
was a joint project between IBM, Motorola, and Apple.  Of course, Apple
was using mostly Motorola-made 601, 603, and 604 chips at the start, and
later other Motorola PPC chips.  IBM went more high end because it wanted
the chips for what the POWER architecture was originally for: the RS/6000
workstations, and later the AS/400 systems (now the i series, I believe).
Even the z series mainframes are based on PowerPC chips now (although
they are not PowerPC machines as such, at least not yet).  Apple did
eventually use IBM chips in the G5 (IBM's 970 series), but complained
they ran too hot and would never go in a laptop.  Motorola spun off its
chip business as Freescale, which now sells piles of embedded PowerPC
chips for various uses (given that Ciscos are PowerPC boxes, I imagine
they are all Freescale-chip based).

Of course, IBM has made other PowerPC chips besides the ones it uses
itself, such as the ones in the Nintendo GameCube, the Wii, and the
Xbox 360, and, working with Toshiba and Sony, the Cell in the PS3,
although IBM has since dropped use of the Cell, and Sony just took over
ownership of Cell processor production.

-- 
Len Sorensen