--- On Sun, 3/29/09, Paul Gilmartin <[email protected]> wrote:
------------snip------------------------------
> But it's worse. Since the only hazard I perceive would occur
> when an extent is freed while another task has a DEB for that
> extent, I once proposed in these pages that at OPEN time, when
> the VOLSER(s) are known, an additional ENQ SHR on DSN and
> VOLSER be issued; when an extent is freed, an ENQ EXCL be
> issued, ABENDing if not immediately satisfied.
>
> An IBM employee countered that there are system processes which
> ENQ on DSN (not VOLSER) and manipulate the extents while never
> performing an OPEN nor creating DEBs. QED? Not QED in my
> perception; omitting the OPEN is bad practice which should be
> corrected.
>
> -- gil
Gil:
This is a somewhat difficult situation (but I agree that it should be fixed).
DFDSS is the only IBM culprit I know of (FDR possibly does this as well) for
making dump copies. They need to bypass a lot of system processing in order to
do their job. On a full-volume dump (last I looked), DFDSS bypasses OPEN and
instead alters in-storage control blocks to get at the data. That is the way
it was designed (I think FDR works the same way). They may have added a RACF
call to check whether the user is authorized to read the data set (I am not
certain, but it sounds reasonable that they would ask RACF). Once you are
"authorized" to read the data set being dumped, it dumps it: they just alter
the DEB to point at the right extent(s) and start reading. Nothing magical
about it. It is "needed" to be able to dump entire volumes, so they take the
tack that has been around since day one. But, as you indicated, there is a
slight issue in that they do not OPEN the data set per se, and that is where
the problem lies.
The dump program MUST be able to access everything that is needed, in as short
a time as possible, and that is (partially) what bypassing OPEN buys them. If
they had to open each data set, it would take a *LOT* longer. Dumping a volume
by data set does take longer, because then DFDSS must open each data set to do
the ENQ.
Right or wrong (you can take either side), in order to make a proper data-set
copy you should be on a system in the complex where every system knows (via
GRS or MIM) about any data set that is being updated or read, with that
information passed to the other system(s) via GRS-type communication.
Otherwise you would not get a valid backup copy. I have heard there is a
product that will keep an open data set in sync for backup; I do not know how
it works, but it almost has to operate at the control-unit level.
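For what it is worth, Gil's proposed serialization is essentially a
reader/writer discipline with a non-blocking exclusive acquire: openers take a
shared ENQ on DSN+VOLSER, and whoever frees an extent attempts an exclusive
ENQ, abending immediately if any opener still holds it. A minimal Python toy
(not real z/OS ENQ services; class and method names are illustrative only):

```python
import threading

class ExtentSerializer:
    """Toy analogy of the proposed DSN+VOLSER ENQ scheme:
    OPEN takes a shared ENQ; freeing an extent attempts an
    exclusive ENQ and fails at once (the 'ABEND') if any
    opener still holds a DEB for the extent."""

    def __init__(self):
        self._lock = threading.Lock()   # guards the holder count
        self._holders = 0               # tasks with a shared ENQ

    def enq_shr(self):
        # OPEN time, VOLSER(s) known: ENQ SHR on DSN and VOLSER
        with self._lock:
            self._holders += 1

    def deq(self):
        # CLOSE: release the shared ENQ
        with self._lock:
            self._holders -= 1

    def try_enq_excl(self):
        # Extent free: conditional ENQ EXCL; False means it was
        # not immediately satisfied, i.e. the freeing task abends
        with self._lock:
            return self._holders == 0

s = ExtentSerializer()
s.enq_shr()
print(s.try_enq_excl())   # False: a DEB still exists for the extent
s.deq()
print(s.try_enq_excl())   # True: safe to free the extent
```

A program like DFDSS that never does OPEN never takes the shared ENQ, which is
exactly the hole the IBM employee's counterexample points at.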
Ed
----------------------------------------------------------------------
For IBM-MAIN subscribe / signoff / archive access instructions,
send email to [email protected] with the message: GET IBM-MAIN INFO
Search the archives at http://bama.ua.edu/archives/ibm-main.html