On 13 October 2016 at 06:41, Timothy Sipples <sipp...@sg.ibm.com> wrote:
> OK, about testing. For perspective, for over two decades (!) Java has
> compiled/compiles bytecode *every time* a Java class is first instantiated.
That seems a little, uh, optimistic. The first editions of The Java
Programming Language book and the corresponding Java Virtual Machine
Specification were both published in 1996. Did IBM even have a mainframe
Java at that point (beyond perhaps compiling Sun's published C code on
MVS), let alone one doing JIT optimization?
> The resulting native code is model optimized depending on the JVM release
> level's maximum model optimization capabilities or the actual machine
> model, whichever is lower. Raise your hand if you're performing "full
> regression testing"(*) before every JVM instantiation. :-) That's absurd,
> of course. I'm trying to think how that would even be technically possible.
> It'd at least be difficult.
> ABO takes the same fundamental approach. From the perspective of an IBM z13
> machine (for example), ABO takes *COBOL* "bytecode" (1990s and earlier
> vintage instructions) and translates it to functionally identical native
> code (2015+ instruction set). It's the core essence of what JVMs do all the
> time, and IBM and others in the industry have been doing it for over two
> decades. Except with ABO it's done once (per module), and you control when
> and where.
Well... Sort of. Seems to me the problem is that the JVM instruction
stream is architected as a whole: the specification defines not just
what each instruction does, but the layout and semantics of an entire
Java class file that must match the specs. The output of an old COBOL
compiler can't be said to be architected like this. There is probably
some internal document describing what the compiler generates (and of
course the old compiler source code is definitive), but I'm willing to
bet it's nowhere close to the standards of any generation of the
Principles of Operation. The Principles of Operation defines very
accurately and completely what each instruction emitted by the old
compiler does, but it has no notion of larger groupings along the lines
of Java class files. And yet the ABO product is certainly not just
translating individual old instructions into newer ones. Rather, it is
surely identifying COBOL paradigms via some kind of pattern matching,
and then retranslating those COBOLy idioms into modern machine code.
Presumably it effectively reverse engineers the old object code (whether
back into COBOL or, much more likely, into an intermediate
representation already accepted by the later stages of the modern
compiler's middle or back end) and generates a new object module. If it
really were just converting old instructions into better-performing
modern ones, the input would not be constrained to be COBOL-generated:
any object code could be optimized this way.
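To make that concrete, here's a toy sketch in Python of the kind of
pattern-based "lifting" I mean. This is emphatically not ABO's actual
design; the mnemonics, the idiom table, and the IR operation names are
all invented for illustration. The point is only the shape of the
process: recognize multi-instruction sequences a particular old compiler
was known to emit, and map each recognized idiom to a single
higher-level operation that a modern back end could re-optimize.

```python
# Hypothetical idiom table: a tuple of consecutive old-architecture
# mnemonics -> an invented IR operation name. A real tool would match
# on operands and context too, not just mnemonics.
IDIOMS = {
    ("ZAP", "AP"): "ir.decimal_add",      # packed-decimal add idiom
    ("MVC", "TR"): "ir.translate_move",   # move-then-translate idiom
}

def lift(instructions):
    """Lift a list of (mnemonic, operands) pairs into IR operations.

    Tries the longest idiom first at each position; anything
    unrecognized falls back to a one-for-one 'ir.raw' wrapper.
    """
    ir = []
    i = 0
    max_len = max(len(key) for key in IDIOMS)
    while i < len(instructions):
        matched = False
        for n in range(max_len, 1, -1):
            window = tuple(m for m, _ in instructions[i:i + n])
            if window in IDIOMS:
                # Collapse the whole matched sequence into one IR op,
                # carrying the original operands along for later stages.
                ir.append((IDIOMS[window],
                           [ops for _, ops in instructions[i:i + n]]))
                i += n
                matched = True
                break
        if not matched:
            mnemonic, ops = instructions[i]
            ir.append(("ir.raw", [mnemonic, ops]))
            i += 1
    return ir
```

The interesting consequence, and the reason the input must be
compiler-generated, is the fallback path: code that doesn't match any
known idiom can only be wrapped, not improved, so the whole approach
depends on knowing what patterns the old compiler emitted.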
All of this says that the analogy between converting Java bytecodes into
native instructions and converting old COBOL object code into modern
object code is weak. And it follows that the need for testing isn't the
same: there is much more that can go wrong in the ABO process than there
is when a JVM translates fully architected bytecodes.
For IBM-MAIN subscribe / signoff / archive access instructions,
send email to lists...@listserv.ua.edu with the message: INFO IBM-MAIN