On Mon, 29 Dec 2014 09:26:44 -0500, Shmuel Metz (Seymour J.)
<[email protected]> wrote:

>That's not what you folks wrote when you introduced the term;
>technical articles described a hierarchy of microcode, millicode and
>z, with the millicode using an extended subset of the z instruction
>set and the microcode using an undocumented architecture, presumably
>VLIW (horizontal).
Yes, lots of articles, and at the end of the day, you still have no
idea how the machine you have is implemented. What runs where changes
depending on the design target for the machine, ease of
implementation, performance sensitivity, and things as mundane or
esoteric as chip real estate or the date. So while there is a
difference in the execution environment of microcode and millicode, I
don't find it a useful distinction except insofar as it tells me which
engineering group to talk to about something. I have seen programmers
get burned by making rash assumptions about the implementation and
trying to micromanage (or outwit?) the microcode, so to speak.

Don't get me wrong. A machine that was designed to let you change the
underlying hardware without altering the programming architecture was
very smart. And way cool. And a major move forward in the industry.
There's a reason the architecture has survived for 50 years, with
those old 24-bit applications still running just fine,
thank-you-very-much.

But my experience within IBM is that we love talking about
technology. In fact, we sometimes love it a bit too much, particularly
when it's new and hasn't completed its evolutionary journey.

Alan Altmark
IBM

----------------------------------------------------------------------
For IBM-MAIN subscribe / signoff / archive access instructions,
send email to [email protected] with the message: INFO IBM-MAIN
