This is the first time I am writing to this list, and I am not quite sure
whether this is the right place for my question. I hope I am not off-topic;
if so, please accept my apology in advance.

I have a program written in COBOL which runs monthly on a z/OS system. The
program reads input records and writes them to a database (DB2) and to
several output files. It consumes nearly the same number of input records
every month, so I wonder why there are such great differences in the
consumed CPU time. For example: last month the program consumed 11.25
according to the job log (whatever that unit means), and this month it
consumed 16.33. I expected the CPU time to be nearly the same, but I cannot
explain this difference. Any suggestions would be greatly appreciated.

----------------------------------------------------------------------
For IBM-MAIN subscribe / signoff / archive access instructions,
send email to [EMAIL PROTECTED] with the message: GET IBM-MAIN INFO
Search the archives at http://bama.ua.edu/archives/ibm-main.html
