COBOL will pack before doing a compare when the operands in the compare are not the same data type and at least one operand is numeric.
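To make that conversion concrete, here is a minimal Python sketch of what the S/370 PACK and UNPK instructions do to a zoned-decimal field. The helper names and the sample bytes are my illustration, not compiler-generated code: zoned decimal carries one EBCDIC digit per byte with the sign in the last byte's zone nibble, while packed decimal holds two digits per byte with the sign in the trailing nibble.

```python
# Rough sketch of S/370 PACK / UNPK semantics on EBCDIC zoned-decimal
# data. The real instructions work on storage operands of given lengths;
# this just shows the nibble shuffling.

def pack(zoned: bytes) -> bytes:
    """Zoned decimal -> packed decimal (two digits per byte, sign last)."""
    digits = [b & 0x0F for b in zoned]        # low nibble of each byte is the digit
    sign = (zoned[-1] & 0xF0) >> 4            # zone of the last byte carries the sign
    nibbles = digits + [sign]                 # ...which moves to the trailing nibble
    if len(nibbles) % 2:
        nibbles.insert(0, 0)                  # pad to a whole number of bytes
    return bytes((nibbles[i] << 4) | nibbles[i + 1]
                 for i in range(0, len(nibbles), 2))

def unpack(packed: bytes) -> bytes:
    """Packed decimal -> zoned decimal (the reverse transformation)."""
    nibbles = []
    for b in packed:
        nibbles += [(b & 0xF0) >> 4, b & 0x0F]
    sign, digits = nibbles[-1], nibbles[:-1]
    out = bytearray(0xF0 | d for d in digits) # 0xF zone = EBCDIC digit
    out[-1] = (sign << 4) | digits[-1]        # sign replaces the last zone
    return bytes(out)

zoned = bytes([0xF1, 0xF2, 0xC3])             # EBCDIC "123" with a C (positive) sign
p = pack(zoned)                               # -> b'\x12\x3c'
assert unpack(p) == zoned
```

When a DISPLAY (zoned) field is compared against a COMP-3 (packed) field, it is this forward conversion the compiler must emit before the decimal compare can run.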
I am a bit concerned about the phrase "pack/unpack pairs." COBOL will not generate a pack/unpack pair for a compare alone ... is there something you failed to mention, like a math operation?

You are fortunate that the shop chose the 360/370/etc. family of computers. PACK and UNPK instructions, and indeed the entire world of decimal arithmetic supported by native hardware instructions, are rare. On most hardware types you would see a call to an internal subroutine. It would not be easy to detect what the subroutine was doing, but most likely it would be converting to the more universal numeric format ... double-precision floating point! I would not suggest porting this application to a PC, for example.

COBOL compilers, like most mainframe HLLs, have an optimizer. If the same source variable is used many times in succession as part of a compare that requires it to be packed, it will not be packed for each reference. I would guess, based on the phrase "pack/unpack pairs," that there are assignment statements involved, not just compares. There are COBOL compile options that will produce a listing showing you both the source statement and the corresponding low-level code. You might wish to use these features rather than guessing what the source statement might have been. Application programs are rarely opened up just to fix performance problems.

There is a famous story in the history of computers. Gene Amdahl, the father of the IBM 360, started his own computer company (it was called Amdahl). The 360 was designed to make it easy to write assembler programs ... hence instructions like PACK and UNPK. By the time Amdahl computers hit the market, the popular thing people wanted was speed, not ease of coding. PACK and UNPK cause a problem in the speed world because they belong to a class of instructions called SS (Storage to Storage) and defy attempts at internal pipelining, which was the method of the day to achieve speed.
Why then did Gene Amdahl go into the plug-compatible marketplace rather than produce a new instruction architecture, like a RISC machine, that could have been bigger, faster, and more cost-effective? (No, he was not a fool; guess again.) He reasoned that bigger, faster, and more cost-effective usually did not justify application rewrites in the software development world. This has been even more true since Amdahl's day, with declining costs for comparable hardware and increasing costs for both people and software.

----------------------------------------------------------------------
For IBM-MAIN subscribe / signoff / archive access instructions,
send email to [EMAIL PROTECTED] with the message: GET IBM-MAIN INFO
Search the archives at http://bama.ua.edu/archives/ibm-main.html

