"But the ABO product is certainly not just translating individual old 
instructions into newer ones. Rather, it is surely identifying COBOL paradigms 
based on some kind of pattern matching, and then retranslating those COBOLy 
things into modern machine code. Presumably it effectively reverse engineers 
the byte codes (whether back into COBOL, or much more likely into an 
intermediate representation that is already accepted by later stages of the 
modern compiler middle or back end) and generates a new object module. If it 
really was just converting old instructions into better performing modern ones, 
the input would not be constrained to be COBOL-generated. Any object code could 
be optimized this way."

The ABO will only operate on COBOL programs, and only those built with compilers from a very specific list which, with the coming 1.2, impressively goes back to the later 
versions of VS COBOL II, as long as the executables are LE compliant. (When I first came 
across mention of what became the ABO I was sceptical that there would be 
sufficient non-Enterprise COBOL executables around to make it worthwhile, but 
it looks like I was wrong on that.)

It was perhaps a mistake to call it an "Optimizer", as people then presume it 
is doing the things that the optimizing engine in the compiler can do. It isn't. 
It "soups up" rather than optimizes. Yes, that broadly improves things, 
but ABO does not touch many things that the compiler's optimizer handles effectively.

No, it doesn't turn the machine code (ESA/370) into anything intermediate. Yes, 
it knows something of the internals, and yes, it knows what it can and can't 
do with that knowledge.
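The distinction being drawn here - rewriting recognized instruction sequences in 
place versus lifting code into a compiler intermediate representation - can be 
sketched generically. The sketch below is purely illustrative: the mnemonics 
resemble mainframe ones but the patterns are invented, and nothing here claims to 
reflect ABO's actual (unpublished) transformations.

```python
# Illustrative peephole-style rewriter: it replaces known sequences of
# older instructions with newer equivalents, in place, without lifting
# the code into any intermediate representation.

# Hypothetical patterns: each maps a tuple of old mnemonics to a
# replacement sequence (e.g. load/add/store -> a single newer instruction).
PATTERNS = {
    ("L", "A", "ST"): ["ASI"],
    ("MVC", "MVC"): ["MVCL"],
}

def peephole(instructions, patterns=PATTERNS, max_len=3):
    """Scan the instruction list, greedily replacing matched windows."""
    out, i = [], 0
    while i < len(instructions):
        # Try the longest window first, then shorter ones.
        for width in range(max_len, 0, -1):
            window = tuple(instructions[i:i + width])
            if window in patterns:
                out.extend(patterns[window])
                i += width
                break
        else:
            # No pattern matched here; keep the instruction as-is.
            out.append(instructions[i])
            i += 1
    return out
```

A rewriter like this never needs to understand *what* the program computes, only 
which local shapes it recognizes - which is why its reach is narrower than a 
compiler optimizer working on a full intermediate representation.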

"There is much more to go wrong in the ABO process"

Why, and with what supporting information?
