With current emulated DASD and PAVs, performance is probably no longer
an issue, but I believe multiple page data sets on one volume is still a
potential availability issue: you wouldn't want failure of a single
emulated drive to compromise two different systems at once, and I seem
to recall that simultaneous failure of multiple page data sets on the
same system used to be fatal.
        Joel C. Ewing

On 12/03/2014 04:37 PM, Ed Gould wrote:
> Derrick:
> 
> *Myself*, I would never put multiple page data sets from multiple
> systems on the same drive; that is the rule I have observed for 40+
> years. Nor would I place multiple page data sets from one system on
> the same drive. Dedicate a drive to each page data set.
> 
> Ed
> 
> On Dec 3, 2014, at 3:00 PM, Derrick Haugan wrote:
> 
>> We built our current paging configuration years ago, when the max #
>> of slots per page dataset was 1M. At the time we wanted to use
>> mod-27s, so we staggered multiple page datasets per volume from
>> different systems (which happen to be in the same sysplex) so as not
>> to waste space on the volume (we did not place any other type of
>> dataset on the paging volumes).
>>
>> We have always used PAVs (now HyperPAVs) for paging for performance
>> reasons, and have never had performance problems with this
>> configuration (but we don't do a lot of paging).
>>
>> We are preparing to use Flash Express on our EC12-based systems, and
>> are considering reconfiguring page datasets on either mod-27s or
>> mod-54s, using one local page dataset per volume, since when using
>> flash/SCM for paging, only VIO pages go to DASD under normal
>> circumstances. This would simplify the paging volume configuration.
>> Currently on our mod-27s there are 12 paging slots per track (run an
>> IDCAMS LISTCAT of the page dataset). So a mod-27 with 30,051
>> cyls/450,765 trks would hold 5,409,180 slots, or roughly 20.6 GB.
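As a sanity check on the figures above, here is a minimal sketch of the slot arithmetic, assuming standard 3390 geometry (15 tracks per cylinder), 12 slots per track as reported by LISTCAT, and 4 KB page slots:

```python
# Slot-capacity arithmetic for a mod-27 paging volume, using the
# figures quoted above. Assumes 15 tracks/cylinder (3390 geometry),
# 12 page slots per track, and 4 KB per slot.
CYLS = 30_051          # mod-27 cylinder count
TRKS_PER_CYL = 15      # 3390 geometry
SLOTS_PER_TRK = 12     # from IDCAMS LISTCAT of the page dataset
SLOT_BYTES = 4096      # one 4 KB page per slot

tracks = CYLS * TRKS_PER_CYL          # 450,765 tracks
slots = tracks * SLOTS_PER_TRK        # 5,409,180 slots
gigabytes = slots * SLOT_BYTES / 2**30

print(tracks)                 # 450765
print(slots)                  # 5409180
print(f"{gigabytes:.1f} GB")  # 20.6 GB
```

The same arithmetic with a mod-54 cylinder count would give roughly double the slot capacity per volume.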
>>
>>
>> Example of what we have been using (mod-27, 5-member sysplex, one
>> local page dataset from each system on the volume):
>>
>> CCCCC-HH  -------- D A T A S E T   N A M E --------  Org    Trks
>> 00000 00  VTOC POINTER                                VP        1
>> 00000 01  SYS1.VTOCIX.PAG300                          PS       14
>> 00001 00  VTOC                                        VT       75
>> 00006 00  SYS1.PAGE27.A090.PLPA.DATA                  VS     5100
>> 00346 00  SYS1.VVDS.VPAG300                           VS       10
>> 00346 10  * * * F R E E   S P A C E * * *             FS        5
>> 00347 00  SYS1.PAGE27.A090.LOCAL1.DATA                VS    87375
>> 06172 00  SYS1.PAGE27.N090.COMMON.DATA                VS     8550
>> 06742 00  SYS1.PAGE27.N090.LOCAL8.DATA                VS    87375
>> 12567 00  SYS1.PAGE27.G090.LOCAL1.DATA                VS    87375
>> 18392 00  SYS1.PAGE27.J090.LOCAL8.DATA                VS    87375
>> 24217 00  SYS1.PAGE27.Y090.LOCAL1.DATA                VS    87375
>> 30042 00  * * * F R E E   S P A C E * * *             FS      135
>> 30051 00  END OF VOLUME                               EV        0
>> ----------------------------------------------------------------------
>> For IBM-MAIN subscribe / signoff / archive access instructions,
>> send email to [email protected] with the message: INFO IBM-MAIN
> 
> 


-- 
Joel C. Ewing,    Bentonville, AR       [email protected] 
