Jason,
 Generic efforts like the ones you mention below are generally fine, but they 
don't address specific issues. In some cases they can even have an adverse 
effect, e.g. adding an index to a DB2 table will not help a job that mainly does 
INSERTs, since every INSERT must now maintain the index as well. Tuning can 
sometimes feel like robbing Peter to pay Paul.
 You need to identify where the problem lies - which job or step is taking more 
time than it used to. Has there been an increase in the volume of data being 
processed? Where is the time being spent - more CPU, more I/O, or both? Once 
you understand the problem, you can probably find a solution. Good diagnosis is 
a prerequisite for a cure, though snake oil sometimes works :)
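On z/OS you would normally get the CPU-versus-elapsed split for each step from SMF type 30 records or RMF reports. Purely as a language-neutral sketch of the idea (the function name and the sleep-based example are illustrative, not any z/OS tooling):

```python
import time

def cpu_vs_wall(step):
    """Run one batch 'step' (any callable) and report CPU vs. elapsed time.
    Elapsed time far above CPU time suggests the step is waiting on I/O;
    roughly equal times suggest it is CPU-bound."""
    wall0 = time.perf_counter()   # wall-clock start
    cpu0 = time.process_time()    # process CPU start
    step()
    wall = time.perf_counter() - wall0
    cpu = time.process_time() - cpu0
    return cpu, wall

# A step that mostly waits (simulating I/O delay): CPU stays near zero
# while elapsed time does not, pointing the diagnosis at I/O, not CPU.
cpu, wall = cpu_vs_wall(lambda: time.sleep(0.2))
```

The same comparison against a compute-heavy step would show the two numbers converging, which is the cue to look at CPU consumers rather than buffering or access paths.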
HTH
Mohammad


On Thu, 27 Dec 2007 02:49:50 -0600, Jason To 
<[EMAIL PROTECTED]> wrote:

>We have been constantly improving our batch window by implementing the
>following available technologies:
>
>1. SMS system-managed buffering (SMB) and data compression on some of our VSAM files
>2. System determined blocksizes on sequential files
>3. BUFNO on some input files
>4. Constantly improving our access to DB2
>    4.1 Add indices
>    4.2 Compression of big DB2 tables
>    4.3 Reorg of DB2 databases
>5. Increase parallelism
>6. Improve DFSORT performance
>7. Improve application efficiencies
>
>We did improve the batch window after implementing all of these;
>however, after some time, the batch starts to become longer again. My
>question is: aside from these, is there anything else we can implement
>to improve our batch further? TIA.
>
>Regards,
>Jason
>

----------------------------------------------------------------------
For IBM-MAIN subscribe / signoff / archive access instructions,
send email to [EMAIL PROTECTED] with the message: GET IBM-MAIN INFO
Search the archives at http://bama.ua.edu/archives/ibm-main.html