Generally, optimisation comes down to finding that not doing some particular 
thing allows the program to use fewer resources, then another thing, and 
another as well. At times, probably mostly, you'll need to replace code rather 
than just lose it, but just losing it is the fastest optimisation (it is not 
unknown to discover that a program up for optimisation is, in fact, entirely 
redundant, or can readily be made so).

ABO optimises the actual generated code. What it has to work with depends on 
what is already there: the compiler options used and the coding 
techniques/local standards followed.

For instance, for a program using "USAGE DISPLAY" numeric data and compiled 
with NOOPT, ABO will make greater savings than for a logically equivalent 
program which uses packed-decimal/binary numerics as appropriate and is 
already compiled with OPT. ABO will (likely) still improve the latter, but 
not by as much as the former.
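As a sketch of the difference (hypothetical field names, illustrative only): 
arithmetic on a zoned-decimal (USAGE DISPLAY) item makes the compiler generate 
pack/unpack work around each operation, which is exactly the kind of code ABO 
can tighten; packed-decimal or binary items leave much less on the table.

```cobol
      * Illustrative fragment only -- hypothetical names.
      * Zoned decimal: arithmetic needs pack/unpack around it,
      * so there is more generated code for ABO to improve.
       01  WS-COUNT-DISPLAY   PIC S9(7)  USAGE DISPLAY.
      * Packed-decimal/binary: arithmetic is done directly,
      * leaving less scope for later improvement.
       01  WS-COUNT-PACKED    PIC S9(7)  USAGE PACKED-DECIMAL.
       01  WS-COUNT-BINARY    PIC S9(8)  USAGE BINARY.
```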

The compiler options used (those which affect code generation) and how the 
code is written dictate the range of possible improvement. A high CPU 
reduction may (strongly may) indicate that the starting point was not a good 
one, from a performance point of view.

On the other hand, saving 30% on 1,000 programs which don't individually "do" 
much, but whose usage adds up over the year, is most effectively done by 
doing nothing but using a tool.

How to approach optimisation depends on the requirements of the task. As with 
ABO, spending a little can save more than was spent :-)

----------------------------------------------------------------------
For IBM-MAIN subscribe / signoff / archive access instructions,
send email to [email protected] with the message: INFO IBM-MAIN