2005/12/13, Mathias Bauer <[EMAIL PROTECTED]>:
> Randomthots wrote:
> > Mathias Bauer wrote:
> >
> >> So possibly the size of the Calc document created from the file (the
> >> memory consumption of Calc itself) caused the swapping you experienced,
> >> but not the xml content itself, which (as outlined above) is never read
> >> into memory as a whole. This is what Daniel tried to point out: it's
> >> Calc itself that consumes the memory, not the bytes of the file.
> >>
> >> Best regards,
> >> Mathias
> >
> > Well, that leaves me back where I started. Why *is* there such a god-awful
> > speed penalty compared to loading the same data from csv or xls? We're
> > talking minutes for the ods vs. seconds for the other files -- orders of
> > magnitude. And more to the point, what was causing all the disc
> > thrashing? I mean, it shut down FC3 for a good thirty minutes.
>
> You are not alone; many people wonder, and the developers have already
> started trying to solve this problem. There are a lot of known problems,
> as well as some ways to fix them, but this is something that needs to be
> investigated carefully. Expect to see something happen in future versions.
>
> It's too much to discuss this on a list like this one. My personal
> presumption is that the disk thrashing is caused just by the memory
> consumption of Calc loading the big file (not by the file itself). You
> can try to observe this yourself when you load a big file.
>
> Of course reading Calc (ODF) XML will never be as fast as reading a csv
> file, but it *can* become fast enough to be bearable. Let's wait and see.
>
> Best regards,
> Mathias
>
> --
> Mathias Bauer - OpenOffice.org Application Framework Project Lead
> Please reply to the list only, [EMAIL PROTECTED] is a spam sink.
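The "never read into memory as a whole" part is ordinary streaming-parser
behaviour, and it is easy to see why the parser's memory use is decoupled
from the file size. Below is a minimal, self-contained sketch -- my own
illustration using plain expat, not OOo's actual SAX service or import
filter code -- that walks an XML file in fixed 16k chunks. The parser only
ever holds one chunk; whatever memory grows is what the handlers build
from the content:

    // Minimal illustration (not OOo source): stream-parse an XML file
    // in fixed 16k chunks with expat. The parser never holds the
    // document as a whole; memory growth comes only from what the
    // handlers build per element.
    #include <expat.h>
    #include <cstdio>

    static void XMLCALL onStart(void* userData, const XML_Char*,
                                const XML_Char**)
    {
        // A real import filter would build document-model nodes here;
        // we only count elements, so memory use stays flat.
        ++*static_cast<long*>(userData);
    }

    int main(int argc, char** argv)
    {
        if (argc < 2) {
            std::fprintf(stderr, "usage: %s content.xml\n", argv[0]);
            return 1;
        }
        std::FILE* f = std::fopen(argv[1], "rb");
        if (!f)
            return 1;

        long elements = 0;
        XML_Parser p = XML_ParserCreate(nullptr);
        XML_SetUserData(p, &elements);
        XML_SetElementHandler(p, onStart, nullptr);

        char buf[16 * 1024];               // the only file buffer we keep
        std::size_t n;
        while ((n = std::fread(buf, 1, sizeof buf, f)) > 0) {
            if (XML_Parse(p, buf, (int)n, 0) == XML_STATUS_ERROR) {
                std::fprintf(stderr, "parse error\n");
                break;
            }
        }
        XML_Parse(p, "", 0, 1);            // signal end of document
        std::printf("%ld elements\n", elements);
        XML_ParserFree(p);
        std::fclose(f);
        return 0;
    }

Built with g++ against libexpat (-lexpat), running this over a big
content.xml should show roughly constant memory, which is Mathias's point:
the bytes of the file are not the problem, the document built from them is.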
I found this link: http://sw.openoffice.org/drafts/optimization.html
(it says it is from 2002, but I guess it is not). According to the page,
this item is still being worked on:

    When importing, whole streams are read into memory at once and are
    then read from memory. They should be uncompressed and written to
    temporary files and be read from there instead. If the stream size
    is less than a nominal value (the SvCacheStream class, which does
    something similar, uses the value of 20k) then the file will
    instead be stored in memory.

So is the complete XML loaded into memory after all?

/$
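For anyone trying to picture the fix that item proposes, here is a rough,
self-contained sketch of the idea. The class and constant names are
invented for illustration -- this is not the SvCacheStream code, only the
strategy the page describes: buffer a stream in memory while it stays
under a nominal 20k, and spill it to a temporary file once it grows past
that, so a huge decompressed stream never sits in memory as a whole:

    #include <cstddef>
    #include <cstdio>
    #include <vector>

    // Sketch of the proposed behaviour (invented names, not OOo code):
    // small streams stay in memory, large ones spill to a temp file.
    class CachedStream {
        static const std::size_t kThreshold = 20 * 1024; // nominal 20k
        std::vector<char> mem;    // in-memory buffer for small streams
        std::FILE* tmp = nullptr; // temp-file backing past the limit
    public:
        ~CachedStream() { if (tmp) std::fclose(tmp); }

        void write(const char* data, std::size_t len) {
            if (!tmp && mem.size() + len > kThreshold) {
                // Crossing the threshold: move the buffered bytes to
                // disk and stop growing the in-memory buffer.
                tmp = std::tmpfile();
                std::fwrite(mem.data(), 1, mem.size(), tmp);
                mem.clear();
            }
            if (tmp)
                std::fwrite(data, 1, len, tmp);
            else
                mem.insert(mem.end(), data, data + len);
        }

        bool onDisk() const { return tmp != nullptr; }
    };

    int main() {
        CachedStream s;
        char block[4096] = {};
        for (int i = 0; i < 8; ++i)   // 32k total, crosses the 20k limit
            s.write(block, sizeof block);
        std::printf("spilled to temp file: %s\n",
                    s.onDisk() ? "yes" : "no");
        return 0;
    }

If I read the page right, today's behaviour is the first branch only --
each whole uncompressed stream is slurped into memory before parsing --
and the proposal is to add the spill-to-disk branch, paying one extra
temp-file round trip for large streams in exchange for bounded memory.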
