Philippe,

You could:
0. Use an alpha release, which I believe has a few
fixes for memory issues.
1. Use the event API if you only need to read a subset
of cells in a large file.
2. Increase the initial JVM heap size (and increase the
maximum heap size if you get an OutOfMemoryError).
3. Use the JDBC-ODBC bridge driver to read the file via
SQL if you are running on Windows (google: excel jdbc).
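To illustrate option 1, here is a rough sketch of what reading with the HSSF event API looks like. This is an assumption-laden example, not tested against your file: it assumes POI's org.apache.poi.hssf.eventusermodel package is on your classpath, and the class name LargeSheetReader is made up. The point is that records are streamed to a listener as they are parsed, so the whole workbook is never held in memory the way new HSSFWorkbook(fs) requires.

```java
import java.io.FileInputStream;

import org.apache.poi.hssf.eventusermodel.HSSFEventFactory;
import org.apache.poi.hssf.eventusermodel.HSSFListener;
import org.apache.poi.hssf.eventusermodel.HSSFRequest;
import org.apache.poi.hssf.record.LabelSSTRecord;
import org.apache.poi.hssf.record.NumberRecord;
import org.apache.poi.hssf.record.Record;
import org.apache.poi.hssf.record.SSTRecord;
import org.apache.poi.poifs.filesystem.POIFSFileSystem;

// Hypothetical driver class -- adapt the record handling to the
// seven or eight columns you actually care about.
public class LargeSheetReader {

    public static void main(String[] args) throws Exception {
        FileInputStream fin = new FileInputStream(args[0]);
        POIFSFileSystem poifs = new POIFSFileSystem(fin);

        HSSFRequest req = new HSSFRequest();
        // Listen to every record and filter for the ones we want.
        req.addListenerForAllRecords(new HSSFListener() {
            private SSTRecord sst; // shared string table, arrives first

            public void processRecord(Record rec) {
                switch (rec.getSid()) {
                    case SSTRecord.sid:
                        // Keep the string table so label cells can be resolved.
                        sst = (SSTRecord) rec;
                        break;
                    case NumberRecord.sid:
                        NumberRecord num = (NumberRecord) rec;
                        System.out.println("numeric cell: " + num.getValue());
                        break;
                    case LabelSSTRecord.sid:
                        LabelSSTRecord lbl = (LabelSSTRecord) rec;
                        System.out.println("string cell: "
                                + sst.getString(lbl.getSSTIndex()));
                        break;
                }
            }
        });

        new HSSFEventFactory().processWorkbookEvents(req, poifs);
        fin.close();
    }
}
```

For option 2, the heap flags are set on the java command line, something like `java -Xms256m -Xmx512m LargeSheetReader big.xls` (the exact sizes are guesses; tune them to your machine).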


hth,
~ amol


--- Philippe DesRosiers
<[EMAIL PROTECTED]> wrote:

> I'm trying to open a very LARGE excel file (around
> 45 thousand rows, but
> only seven or eight columns), but HSSF seems to hang
> during the creation of
> the HSSFWorkbook. Here's some code:
> 
> <snip>
>     content = new FileInputStream(myFile);
>     POIFSFileSystem fs = new POIFSFileSystem(content);
>     HSSFWorkbook workbook = new HSSFWorkbook(fs);  // code hangs on this line.
> </snip>
> 
> Does anyone have any ideas? Does HSSF just not
> support large files? What is
> the largest file size supported? Can I enable
> logging in HSSF to get more
> information?
> 
> thanks,
> 


---------------------------------------------------------------------
To unsubscribe, e-mail: [EMAIL PROTECTED]
Mailing List:     http://jakarta.apache.org/site/mail2.html#poi
The Apache Jakarta Poi Project:  http://jakarta.apache.org/poi/