Tony Harminc wrote:
>That seems a little, uh, optimistic. The Java Programming Language
>book, and the corresponding Java Virtual Machine Specification, first
>editions, were both published in 1996.

No, not optimistic. Mere fact. Sun Microsystems made Java 1.0 generally
available for download on January 23, 1996, for the Windows 95, Windows NT,
and Solaris operating systems (three different operating systems across two
different processor architectures). That was over two decades ago.

http://web.archive.org/web/20080205101616/http://www.sun.com/smi/Press/sunflash/1996-01/sunflash.960123.10561.xml

I was not even counting Sun's "alpha" and "beta" release timeline, but some
would.

Java isn't the earliest example of a bytecode-to-native technology. IBM's
Technology Independent Machine Interface (TIMI) proved the concept much
earlier. TIMI traces its roots to the IBM System/38, introduced in August
1979. The latest TIMI technologies are found in the IBM i platform, another
platform deservedly famous as a stable, reliable home for applications.
Before program execution (immediately before, if necessary), TIMI
automatically translates the bytecode to native code tailored to that
particular hardware model, persists the translation, and runs it. TIMI is
arguably even closer to ABO conceptually than Java is.
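That translate-once-then-persist pattern can be sketched in miniature. This is
purely illustrative (it is not IBM's TIMI implementation, and the "bytecode"
here is an invented toy format of +n/-n operations): a program is translated to
an executable form the first time it is loaded, the result is kept in a cache
standing in for persistent storage, and later loads reuse the existing
translation instead of translating again.

```java
import java.util.HashMap;
import java.util.Map;
import java.util.function.IntUnaryOperator;

// Toy sketch of translate-just-before-execution with persistence.
// NOT the real TIMI: the "bytecode" is an invented format such as
// "+5 -2 +10", and the "native code" is a composed Java function.
public class TranslateOnceCache {
    // Stands in for the persisted, model-specific translations.
    private static final Map<String, IntUnaryOperator> persisted = new HashMap<>();

    // Load a program: translate only if no persisted copy exists yet.
    static IntUnaryOperator load(String bytecode) {
        return persisted.computeIfAbsent(bytecode, TranslateOnceCache::translate);
    }

    // One-time translation of the toy bytecode into an executable form.
    private static IntUnaryOperator translate(String bytecode) {
        IntUnaryOperator compiled = x -> x;
        for (String op : bytecode.split(" ")) {
            int n = Integer.parseInt(op.substring(1));
            IntUnaryOperator prev = compiled;
            IntUnaryOperator step =
                op.charAt(0) == '+' ? x -> x + n : x -> x - n;
            compiled = x -> step.applyAsInt(prev.applyAsInt(x));
        }
        return compiled;
    }

    public static void main(String[] args) {
        IntUnaryOperator program = load("+5 -2 +10");
        System.out.println(program.applyAsInt(0));        // 13
        // Second load returns the persisted translation unchanged.
        System.out.println(load("+5 -2 +10") == program); // true
    }
}
```

The point of the sketch is the lifecycle, not the arithmetic: the source
("bytecode") never changes, only the cached translation beneath it, which is
exactly why the testing calculus differs from a recompile.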

Another analog to ABO is the technology that Transitive Corporation (a
company IBM acquired) developed. Apple's Mac OS X 10.4 ("Tiger") for Intel
x86 introduced "Rosetta," a binary translation technology. Rosetta took Mac
OS X applications compiled for PowerPC and translated their instructions,
as they ran, into x86 instructions for execution on Apple's then-new
Intel-based Macintoshes. It worked extremely well. Apple removed Rosetta
from Mac OS X 10.7 some years later, after developers had recompiled their
applications to ship x86 versions. For a time IBM offered PowerVM Lx86 for
IBM Power Servers, a Transitive-derived technology that performed the
reverse translation (x86 to Power).

There are many well proven technologies that parallel ABO. Java, TIMI,
Rosetta, and PowerVM Lx86 are four examples.

With all that said, one has to be smart about when, where, how, and how
much to test. Bill Woodger reminded me of an important fact, that if you're
not smart about testing and you're just running "big" tests over and over
to tick checkboxes, even when it makes little or no sense, then you're not
actually testing well either. You're very likely missing genuine risks.
Literally nobody wins if you've got an expensive, bloated testing regime
that isn't actually testing what you should be testing in 2016 (and
beyond). There are also many cases when business users simply throw up
their hands and declare "No thank you; I can't afford to wait 68 days for
your testing program to complete," and they shop elsewhere for their IT
services.

My friends and colleagues, we must always be *reasonable* and adaptable,
else we're lost at sea. And it's simply unreasonable to treat ABO
optimizations the same for testing purposes as source code changes and
recompiles. The source code doesn't change! The right ABO testing effort
is something more than zero and something less than what's ordinarily
advisable for a typical source code change-and-recompile cycle.
Please consult with ABO technical experts to help decide where precisely
that "in between" should be given your particular environment and
applications.

--------------------------------------------------------------------------------------------------------
Timothy Sipples
IT Architect Executive, Industry Solutions, IBM z Systems, AP/GCG/MEA
E-Mail: sipp...@sg.ibm.com
