On 13.11.2012 09:55, Harry Yu wrote:
hello,     I ran into a problem when using XSSF to read an Office 2007 (.xlsx)
file with about 750,000 records, roughly 30 MB in size. When I load the file
with XSSFWorkbook, I get: java.lang.OutOfMemoryError: Java heap space. I think
the problem is that the file is too large and the JVM can't allocate enough
space for it.
     I don't want to simply increase the JVM's memory allocation, because that
doesn't always help: a file can always be large enough to exceed the available
RAM.
     I would like to know how to load a large number of records in chunks for
downstream processing. Which API can do this? Or does anyone have a better
solution?

    Thanks
   Harry Yu


perhaps you can use the streaming APIs? Note that SXSSF only supports
*writing*; for reading a large .xlsx without loading the whole workbook into
memory, POI's event API (XSSFReader together with a SAX ContentHandler)
streams the sheet XML row by row.
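The idea underneath the event API is plain SAX: the sheet is one big XML
stream, and a handler receives each cell as it is parsed, so memory use stays
roughly constant no matter how many rows there are. Below is a stdlib-only
sketch of that pattern on a simplified toy "sheet" fragment (the element names
here are made up for illustration, not real SpreadsheetML; with POI you would
plug an analogous handler into XSSFReader instead):

```java
import java.io.ByteArrayInputStream;
import java.nio.charset.StandardCharsets;
import java.util.ArrayList;
import java.util.List;
import javax.xml.parsers.SAXParserFactory;
import org.xml.sax.Attributes;
import org.xml.sax.helpers.DefaultHandler;

public class StreamingSheetDemo {
    // Collects cell values row by row; only one row is held at a time,
    // which is what keeps the memory footprint flat.
    static class RowHandler extends DefaultHandler {
        final List<List<String>> rows = new ArrayList<>();
        private List<String> currentRow;
        private StringBuilder text;

        @Override
        public void startElement(String uri, String local, String qName,
                                 Attributes atts) {
            if (qName.equals("row")) currentRow = new ArrayList<>();
            if (qName.equals("c")) text = new StringBuilder();
        }

        @Override
        public void characters(char[] ch, int start, int length) {
            if (text != null) text.append(ch, start, length);
        }

        @Override
        public void endElement(String uri, String local, String qName) {
            if (qName.equals("c")) {
                currentRow.add(text.toString());
                text = null;
            }
            if (qName.equals("row")) {
                // Process (or batch) the finished row here, then drop it.
                rows.add(currentRow);
                currentRow = null;
            }
        }
    }

    public static void main(String[] args) throws Exception {
        String xml = "<sheet>"
                + "<row><c>a1</c><c>b1</c></row>"
                + "<row><c>a2</c><c>b2</c></row>"
                + "</sheet>";
        RowHandler handler = new RowHandler();
        SAXParserFactory.newInstance().newSAXParser().parse(
                new ByteArrayInputStream(xml.getBytes(StandardCharsets.UTF_8)),
                handler);
        System.out.println(handler.rows);  // prints [[a1, b1], [a2, b2]]
    }
}
```

With POI itself, the same role is played by the classes in
org.apache.poi.xssf.eventusermodel: open the file via OPCPackage, obtain the
sheet streams from XSSFReader, and parse each one with your SAX handler,
emitting rows in chunks to your downstream processing.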

regards
Bodo

