[GTALUG] example of why RISC was a good idea

Lennart Sorensen lsorense at csclub.uwaterloo.ca
Sun May 22 12:07:12 EDT 2016


On Sat, May 21, 2016 at 04:01:03PM -0400, D. Hugh Redelmeier wrote:
> Technically, that was called (horizontal) microcode.
> 
> With WCS, a customer could sweat bullets and perhaps get an important
> performance improvement.  It wasn't easy.  Perhaps that is similar to
> the way GPUs can be used very effectively for some computations.
> 
> My opinions:
> 
> Microcode made sense when circuits were significantly faster than core
> memory and there was no cache: several microcode instructions could
> be "covered" by the time it took to fetch a word from core.
> 
> Microcode can still make sense but only for infrequent things or for
> powerful microcode where one micro-instruction does just about all the
> work of one macro-instruction.  Even with these considerations, it
> tends to make the pipeline longer and thus the cost of branches higher.
> 
> The big thing about RISC was that it got rid of microcode.  At just
> the right time -- when caches and semiconductor memory were coming
> onstream.  Of course UNIX was required because it was the only popular
> portable OS.
> 
> The idea of leaving (static) scheduling to the compiler instead of
> (dynamic) scheduling in the hardware is important but not quite right.
> Many things are not known until the actual operations are done.  For
> example, is a memory fetch going to hit the cache or not?  I think
> that this is what killed the Itanium project.  I think that both kinds
> of scheduling are needed.
> 
> CISC losses: the Instruction Fetch Unit and the Instruction Decoder
> are complex and potential bottlenecks (they add to pipeline stages).
> CISC instruction sets live *way* past their best-before date.
> 
> RISC losses: instructions are usually less dense.  More memory is consumed.
> More cache (and perhaps memory) bandwidth is consumed too.
> Instruction sets are not allowed to change as quickly as the
> underlying hardware, so the instruction set is not as transparent as
> it should be.
> 
> x86 almost vanquished RISC.  No RISC workstations remain.  On servers,
> RISC has retreated a lot.  SPARC and Power don't seem to be growing.
> But from out in left field, ARM seems to be eating x86's lunch.  Atom, x86's
> champion, has been cancelled (at least as a brand).

But x86 has internally been RISC since the Pentium Pro, with an instruction
decoder that converts x86 instructions into an internal micro-op format.
I suspect it's not pure RISC, but then again neither is ARM or probably
anyone else these days.

-- 
Len Sorensen
