Amit,
From the thread you started on comp.lang.apl it seems that your files are only 
about 50,000 rows. As Richard suggested in that thread, that isn't very big, so 
you shouldn't have to resort to memory-mapped files for a file of that size. If 
you just want to import the data from the Excel spreadsheet and then work with 
it in J, I wouldn't imagine that will present too much of a problem.

I just used the Tara addon for J to create an Excel workbook with a sheet 
containing 50,000 rows and 10 columns. I then used Tara to read that data from 
the Excel worksheet back into J.
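As a minimal sketch of the J side of that test, an array of the same shape can 
be built with standard primitives. The noun name "data" below is just 
illustrative, and the Tara read/write verbs themselves are left to the addon's 
documentation rather than guessed at here:

   NB. test array comparable to the worksheet above:
   NB. 50,000 rows by 10 columns of random floats
   data =: ? 50000 10 $ 0
   $ data
50000 10
   NB. the workbook round-trip itself was done with the Tara addon
   NB. (load 'tables/tara'); see its documentation for the
   NB. verbs that write and read the worksheet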

Importing (reading) the file into J took a number of seconds using Tara, but 
once the array was in J, I could operate on the whole array, or any part of it, 
quickly and easily. For example, summing each of the 10 columns over all 
50,000 rows of the resulting array took 0.0057 seconds in J.
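If you want to check that kind of figure yourself, the column sums and a rough 
timing can be done with standard J primitives, assuming the array read back 
from Tara is in a noun called "data" as in the sketch above:

   NB. sum down each column (10-item result)
   +/ data
   NB. time the same sentence with the 6!:2 foreign (seconds)
   6!:2 '+/ data'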

I would still suggest that if you really want to know whether what you are 
trying to achieve is possible and/or sensible, you should give a bit more 
detail.


--- Sherlock, Ric wrote:
> ---amit bolakani wrote:
> > I was considering using J for building a commercial application which
> > would need to handle huge excel spread sheets on Windows systems with
> > good performance. Does J handle this well? I would believe that one
> > would need to use memory mapped files for this. Are memory mapped files
> > functionality developed well enough with J for me to consider it as my
> > language of choice for this project?
>
> I think that you will get more helpful answers if you explain a bit
> more what you mean by "handle huge excel spread sheets".
>
> What do you want to do with them (or the information on them)?
----------------------------------------------------------------------
For information about J forums see http://www.jsoftware.com/forums.htm
