> Maybe we're not understanding what is meant by "microcode"... The only CPU
> I've designed was a 4-bit system that didn't use microcode to get its work
> done (it was for a class), so I can't claim direct experience, but I at
> least thought I knew what the word "microcode" implied... a level of
> abstraction, if you will, between the opcodes and the actual hardware... a
> reduced instruction set sitting there that could take those complex
> opcodes and break them down into fundamental instructions.
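That abstraction layer can be sketched in a few lines. This is a purely illustrative toy (the opcode names, register names, and the dict-as-ROM are all made up, not any real machine): a "microcode ROM" maps each visible opcode to a sequence of fundamental micro-instructions that the hardware can actually perform.

```python
# Toy sketch of microcode as a lookup between opcodes and fundamental
# micro-instructions. All names here are hypothetical and illustrative.

# The fundamental operations the "hardware" can actually do:
def load(state, reg, addr):      # reg <- mem[addr]
    state["regs"][reg] = state["mem"][addr]

def alu_add(state, dst, src):    # dst <- dst + src
    state["regs"][dst] += state["regs"][src]

# "Microcode ROM": each visible opcode expands to one or more micro-steps.
MICROCODE = {
    # ADD_M: add a memory operand to a register (complex: two micro-steps)
    "ADD_M": [
        lambda st, dst, addr: load(st, "tmp", addr),
        lambda st, dst, addr: alu_add(st, dst, "tmp"),
    ],
    # ADD_R: register-to-register add (simple: one micro-step)
    "ADD_R": [
        lambda st, dst, src: alu_add(st, dst, src),
    ],
}

def execute(state, opcode, *operands):
    """Run every micro-step the ROM holds for this opcode."""
    for micro_step in MICROCODE[opcode]:
        micro_step(state, *operands)

state = {"regs": {"A": 5, "B": 3, "tmp": 0}, "mem": {0x10: 7}}
execute(state, "ADD_M", "A", 0x10)   # A <- A + mem[0x10]  -> 12
execute(state, "ADD_R", "A", "B")    # A <- A + B          -> 15
print(state["regs"]["A"])
```

The point is only the shape of the thing: the programmer-visible instruction set sits on top, and a small internal engine steps through the fundamental operations underneath.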


Modern processor architectures are more about datapaths than microcode.  The
microcode they DO use is far more obtuse than the old 2900-style bitslice
architectures (which, IIRC, ran at maybe 4-10 MHz peak and took 4-8
microcycles for a single CISC-type 'instruction'; I wrote tons of that stuff
too).

In fact, the Pentiums since the P2 at least have a sort of microcode; it is
used to break the more complex instructions down into 'micro-ops', which are
executed at the various stages of the execution pipeline.  Simple
instructions remain a single cycle per stage.  This 'microcode' can be
field-updated by Intel via BIOS upgrades; they do this primarily to fix bugs
in particular versions.  Most of the fixes run considerably slower than the
original target instruction.
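The decode step described above can be sketched as well. This is a hypothetical model, not Intel's actual decoder: a simple register-register instruction emits a single micro-op, while an instruction with a memory operand expands into a load micro-op plus an ALU micro-op.

```python
# Hypothetical sketch of micro-op decomposition at the decode stage.
# Instruction and micro-op names are made up for illustration.

def decode(instruction):
    """Expand one visible instruction into a list of micro-ops."""
    op, dst, src = instruction
    if op == "ADD" and src.startswith("["):      # memory operand: complex
        return [("LOAD", "tmp", src),            # micro-op 1: fetch operand
                ("ALU_ADD", dst, "tmp")]         # micro-op 2: do the add
    return [("ALU_" + op, dst, src)]             # simple: stays one micro-op

program = [("ADD", "eax", "ebx"),    # register-register -> 1 micro-op
           ("ADD", "eax", "[mem]")]  # memory operand    -> 2 micro-ops

uops = [u for instr in program for u in decode(instr)]
print(len(uops))   # two instructions become three micro-ops
```

Once decoded, each micro-op is what actually flows down the pipeline stages, which is why the simple case keeps its one-cycle-per-stage behavior while the complex case occupies extra slots.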



_________________________________________________________________________
Unsubscribe & list info -- http://www.scruz.net/~luke/signup.htm
Mersenne Prime FAQ      -- http://www.tasam.com/~lrwiman/FAQ-mers