I did search the archives for this one, but I did not find any references.

Is there some way to measure how many BLSR buffers (data and index) were
actually used after a job has completed?  For instance, if
BUFND=2700,BUFNI=180 was specified in a SUBSYS=(BLSR,...) DD statement, how
could I discover how many of the 2700 data buffers actually were used for
reading or writing some CI in a VSAM KSDS?  

And if such a method exists, can ordinary application programmers (e.g.,
those with no authority to use DCOLLECT) use the method?

Obviously, this is a tuning effort at the application level.  I know that
BLSR dramatically reduces my EXCP count in this application while slightly
raising the CPU time, but I need to measure how much of the memory
allocated by the BLSR specification is actually being used, and how much is
excess unused space.

TIA for any info/url/RTFM you can provide.

Peter
--
As of 04/01/2007 I have a new email address, so please update your address
book: Peter dot Farley at broadridge dot com



----------------------------------------------------------------------
For IBM-MAIN subscribe / signoff / archive access instructions,
send email to [EMAIL PROTECTED] with the message: GET IBM-MAIN INFO
Search the archives at http://bama.ua.edu/archives/ibm-main.html
