It is often quoted that mainframe files compress 4 to 1.  But the data
might already be compressed, and VSAM overhead might not compress as
well.  So I would try 20%, then 10%, of the input file size, and be
sure to include enough free space on enough volumes for the base file
size to spread out.  A non-extended-format PS dataset is limited to 16
extents per volume.
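
For a rough starting point, that sizing arithmetic can be sketched as
below (a minimal sketch: the 20% ratio, the 5-volume spread, and the
function name are illustrative assumptions, not a DFSMSdss rule):

```python
# Rough space estimate for a compressed ADRDSSU dump.  We deliberately
# do NOT trust the 4:1 folklore ratio (the source data may already be
# compressed), so we target 10-20% of the input size instead.

def dump_allocation(input_cyls, ratio=0.20, volumes=5):
    """Return (primary, secondary) cylinder allocation per volume.

    input_cyls -- size of the source dataset in cylinders
    ratio      -- assumed compressed size as a fraction of the input
    volumes    -- number of volumes the dump may spread across
    """
    target = int(input_cyls * ratio) + 1    # expected total dump size
    primary = max(1, target // volumes)     # spread across the volumes
    secondary = max(1, primary // 2)        # headroom if ratio is worse
    return primary, secondary

# Example: a 100,000-cylinder VSAM cluster, hoping for 20% after COMPRESS
print(dump_allocation(100_000))   # -> (4000, 2000)
```

If the first dump ends short on space, rerun the estimate with the
actual ratio you observed rather than the assumed one.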

On Fri, Dec 19, 2025 at 12:00 PM Jon Perryman <[email protected]> wrote:
>
> On Fri, 19 Dec 2025 19:06:45 +0400, Peter <[email protected]> wrote:
>
> >Is there a sample JCL to dump the dataset to a compressed multi volume
> >dataset ?
>
> Documentation for a logical data set dump:
> https://www.ibm.com/docs/en/zos/2.1.0?topic=dfsmsdss-dump-dataset-syntax-logical-data-set
> Review it for options you may need. You'll find RESTORE in the same
> manual.
>
> //STEP1    EXEC  PGM=ADRDSSU
> //* VOL=(,,,5) lets the dump spread over up to 5 volumes; adjust as needed
> //DUMPDD   DD    DSN=BACKUP.DSN,DISP=(NEW,CATLG),
> //    UNIT=SYSDA,VOL=(,,,5),
> //    SPACE=(CYL,(50000,50000),RLSE)
> //SYSPRINT DD    SYSOUT=A
> //SYSIN    DD    *
>  DUMP OUTDD(DUMPDD) -
>    DS(INCL(DATASET.TO.DUMP)) -
>    COMPRESS  /* or HWCOMPRESS; add TOL(ENQF) WAIT(0,0) if you can't  */
>              /* quiesce the dataset and accept the risks             */
> /*
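>
> A matching RESTORE sketch under the same assumptions (BACKUP.DSN and
> DATASET.TO.DUMP are placeholder names; check the RESTORE options in
> the same manual before relying on it):
>
> //STEP2    EXEC  PGM=ADRDSSU
> //DUMPDD   DD    DSN=BACKUP.DSN,DISP=SHR
> //SYSPRINT DD    SYSOUT=A
> //SYSIN    DD    *
>  RESTORE INDD(DUMPDD) -
>    DS(INCL(DATASET.TO.DUMP)) -
>    CATALOG
> /*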
>
> >So what would be the recommended size to dump on a compressed dataset?
>
> Since it's a VSAM dataset, compress the backup so it uses as little
> space as possible. Additionally, ADRDSSU dumps full tracks, so dump
> size also depends on track utilization. Look through the options to
> see if anything else is relevant to your situation.
>
> ----------------------------------------------------------------------
> For IBM-MAIN subscribe / signoff / archive access instructions,
> send email to [email protected] with the message: INFO IBM-MAIN



-- 
Mike A Schwab, Springfield IL USA
Where do Forest Rangers go to get away from it all?

