>The Y2K "crisis" might have been mitigated if more designers
>had said, "Hey, pretty soon we're going to need 4-digit years.
>Let's provide them now."
In fairness, there wasn't a Y2K "crisis" so much as there was Y2K procrastination. Everyone was aware of the problem from the mid-1970s on, but there was always an excuse: it was premature, or not necessary, or the system would be replaced by then, and so on. It was not a designer or programmer problem.

>And much of the anguish of 24-bit to 31-bit address conversion
>might have been avoided if designers had thought to reserve
>the top 8 bits of addresses instead of using them for flags.
>Instead, many OS interfaces remain 24-bit constrained.

Again, it's an easy argument to make when one is wallowing in terabytes of memory. Not so much when an application may have had 80K [or less] to run in. After all, it's amazing what one can see in hindsight. Yet I seriously doubt that anyone today would look at 64-bit addressing and say, "Let's go to 128-bit, because we'll need it at some point in the future." Who would be willing to "reserve" more storage and modify the architecture and infrastructure for such a contingency?

>How many programmers are still using 31-bit branch instructions
>rather than 64 because z/OS doesn't support execution above
>the bar? This year.

There is no 31-bit versus 64-bit branch instruction. All addresses are interpreted based on the addressing mode, and Branch Relative instructions don't use a base.

Adam

----------------------------------------------------------------------
For IBM-MAIN subscribe / signoff / archive access instructions,
send email to [email protected] with the message: INFO IBM-MAIN
