Bill Woodger scribbled :-) That may depend on how you take this claim: 
"[ABO] ... produces a functionally equivalent executable program".

This is yet another reason why I'd like the ABO folks to say more about 
the kinds of program transformations they do.

Cheers, Martin

Martin Packer,
zChampion, Principal Systems Investigator,
Worldwide Cloud & Systems Performance, IBM

+44-7802-245-584

email: [email protected]

Twitter / Facebook IDs: MartinPacker

Blog: 
https://www.ibm.com/developerworks/mydeveloperworks/blogs/MartinPacker

Podcast Series (With Marna Walle): 
https://developer.ibm.com/tv/category/mpt/



From:   Bill Woodger <[email protected]>
To:     [email protected]
Date:   13/04/2016 12:01
Subject:        Automatic Binary Optimizer (ABO)
Sent by:        IBM Mainframe Discussion List <[email protected]>



ABO should be useful for sites not going to V5/V6 "anytime soon"

The following presumes a zEC12 ("12") or z13 ("13") machine, and z/OS 2.1 or higher.

1. Staying with pre-V4.2, bimbling along with out-of-service compilers

Pre-Enterprise COBOL programs would need to be recompiled with Enterprise COBOL 
before ABO can process them. For any executables already built with Enterprise 
COBOL, ABO will operate and give benefit.

2. Migrating to V4.2

Even after recompiling with V4.2, ABO would still give benefits: access to 
"newer" instructions and more advanced optimisation techniques than the V4.2 
compiler itself generates.

ABO could be useful to sites migrating to V5/V6

1. Really rapid migration, probably no point in ABO ("Really rapid" is 
subjective).

2. Long process of migration, ABO would be useful for heavy CPU-consuming 
programs which are scheduled to migrate a year or more down the track ("Long" 
is subjective).

An interesting question is whether, or to what extent, you "test" 
optimised programs.

That may depend on how you take this claim: "[ABO] ... produces a 
functionally equivalent executable program".

I know that there are some people/sites who don't "trust" the existing 
optimisation in the compilers. Obviously they are not going to be 
ABO-users anyway.

IBM's advice for the use of OPT (even prior to V5) is to "whack it on just 
before you go into Production". I've never liked that, but not because I 
don't trust the optimiser. I don't trust the application code :-) Anyone 
who has ever encountered an accidental overlay of executable code is 
probably aware that "incidental" overwriting can also occur. And it 
can be so incidental as to "work to specification", until something changes 
in the code and now-significant storage gets overwritten.
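To make that concrete, here is a small sketch (in Python, as a stand-in model; 
the field names and layout are hypothetical, not from any real COBOL program or 
from ABO) of how an out-of-bounds write can be harmless under one storage 
layout and silently corrupting under another:

```python
# Hypothetical model of a working-storage area, 16 bytes.
storage = bytearray(16)

def move_to_field_a(value: bytes) -> None:
    # Bug: copies 9 bytes into what is meant to be an 8-byte field,
    # so byte 8 of storage is always overwritten.
    storage[0:9] = value[:9]

# Layout 1: FIELD-A is bytes 0-7, byte 8 onwards is unused filler.
move_to_field_a(b"ABCDEFGHI")
# The overwrite hits only filler, so the program "works to specification".
assert storage[0:8] == b"ABCDEFGH"

# Layout 2: a change elsewhere places FIELD-B at byte 8.
storage[8] = ord("X")          # FIELD-B now holds significant data
move_to_field_a(b"ABCDEFGHI")  # same bug, same write as before
# FIELD-B has been silently clobbered: the overwrite is now significant.
assert storage[8] == ord("I")
```

The bug is identical in both runs; only the surrounding layout changed. That 
is the sense in which a latent overwrite can survive N stages of testing and 
then bite when something (a code change, or a different object layout) makes 
the clobbered storage significant.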

I don't want a program to go through N stages of "testing" and then have 
the object code changed (so that something may become significant) just 
before tossing it into Production. So I recommend OPT after initial 
program-testing.

So, should programs be "tested" after ABOing, partly as a test of ABO itself? 
Should ABO be "tested" in isolation (a wider coverage of things in a 
shorter period of time) or - perhaps taking ABO's stability record into 
account (no "program doesn't work" fixes for ABO yet) - just go with the flow?

----------------------------------------------------------------------
For IBM-MAIN subscribe / signoff / archive access instructions,
send email to [email protected] with the message: INFO IBM-MAIN



Unless stated otherwise above:
IBM United Kingdom Limited - Registered in England and Wales with number 
741598. 
Registered office: PO Box 41, North Harbour, Portsmouth, Hampshire PO6 3AU
