Hi!
> When extracting large zip packages (~1.6Gb), Java heap space error is
> thrown. Currently used attribute -Xmx500M isn't enough. Memory usage rises
> gradually during extraction (slowly, but surely). We have a need to handle
> packages up to 2Gb size.
>   
Does this zip file contain many entries?
If so, we're out of luck for now: VFS reads the whole directory
structure into memory, and I don't see what we can do to avoid this
without introducing a HUGE performance impact.

If this is not the case, it would be nice if you could run a
profiler to see what uses so much memory; maybe it's a
java.util.zip limitation again. Did you try some simple code using
java.util.zip directly?
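Something along these lines would exercise java.util.zip directly, streaming each entry instead of holding anything in memory; the class name, entry count, and buffer size below are just placeholders, not anything from VFS:

```java
import java.io.*;
import java.util.zip.*;

public class ZipStreamCheck {

    // Count entries by streaming through the archive; memory use stays
    // roughly constant no matter how many entries the zip contains.
    static int countEntries(File zip) throws IOException {
        int count = 0;
        try (ZipInputStream in = new ZipInputStream(
                new BufferedInputStream(new FileInputStream(zip)))) {
            byte[] buf = new byte[8192];
            for (ZipEntry e; (e = in.getNextEntry()) != null; ) {
                while (in.read(buf) != -1) { /* drain the entry's data */ }
                count++;
            }
        }
        return count;
    }

    public static void main(String[] args) throws IOException {
        // Build a small throwaway archive as a stand-in for the real
        // 1.6Gb package, then stream it back.
        File zip = File.createTempFile("test", ".zip");
        zip.deleteOnExit();
        try (ZipOutputStream out = new ZipOutputStream(
                new FileOutputStream(zip))) {
            for (int i = 0; i < 1000; i++) {
                out.putNextEntry(new ZipEntry("entry-" + i + ".txt"));
                out.write(("data " + i).getBytes("UTF-8"));
                out.closeEntry();
            }
        }
        System.out.println(countEntries(zip));
    }
}
```

If heap usage stays flat with this but climbs under VFS, the growth is in the directory structure VFS builds, not in java.util.zip itself.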

Ciao,
Mario


---------------------------------------------------------------------
To unsubscribe, e-mail: [EMAIL PROTECTED]
For additional commands, e-mail: [EMAIL PROTECTED]
