On Tuesday 21 July 2009, Bob Woodside wrote:
>
>    I don't understand why there would be a boundary condition at 7GB. 
>4GB, yes, but not 7GB. Have you tried zipping a file that is just a 
>little larger than 4 GB? 

Sorry about that - Yes, you are right, the problem is the 4GB boundary. I
created a file of just over 4GB and tried it, and it too failed with:

zip error: Entry too big to split, read, or write (Poor compression
resulted in unexpectedly large entry - try -fz)   

>    Whew! Sorry, I can't help you on this one. I'm barely able to
>scrounge up enough resources to test a 4GB file. I can't even come
>close to a 46GB file.

I do understand, space is a problem. I didn't believe it myself when I
saw this one.

We have odd-sized data files ranging from 10MB, 3GB, 7GB, 10GB, 15GB
and 30GB up to the huge 46GB one, all with various record lengths, which
made finding the actual sizes not so simple.
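For a back-of-the-envelope check, a fixed-length-record dataset's byte size is just record length times record count. A trivial sketch (the LRECL and record count below are invented for illustration, not taken from the actual files):

```python
def dataset_bytes(lrecl, record_count):
    """Approximate size of a fixed-length-record dataset in bytes."""
    return lrecl * record_count

# e.g. a hypothetical LRECL=1500 dataset with 33 million records:
size = dataset_bytes(1500, 33_000_000)
print(size / 1024**3)  # roughly 46 GiB
```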

I've just gone through the sizes of the zipped files and possibly found
the cause of the 46GB file failure. All the other output (zipped) file
sizes are less than 2GB; the 30GB file zipped to just under 2GB and
worked fine. It appears that when the OUTPUT (zipped) file size is >
2GB, it fails with:

 zip I/O error: EDC5013I No hiperspace blocks are available for
expansion.
 zip error: Output file write failure (write error on zip file)


Thanks for all the assistance Bob, it is very much appreciated.

Vikesh
Please Note: This email and its contents are subject to our email legal notice 
which can be viewed at http://www.sars.gov.za/Email_Disclaimer.pdf 

----------------------------------------------------------------------
For IBM-MAIN subscribe / signoff / archive access instructions,
send email to [email protected] with the message: GET IBM-MAIN INFO
Search the archives at http://bama.ua.edu/archives/ibm-main.html
