Hi,
I don't think this problem is castor-related.
> I'm unmarshalling a large XML file (about 100 MB) ...
> during unmarshalling I'm getting
> java.lang.OutOfMemoryError ... even though I have 512 MB RAM ... and only
> 260 MB is allocated during unmarshalling ...
First of all, I don't think it is a good idea to unmarshal the whole
100 MB file into RAM at all. If possible, it would be better to process
the file in chunks: go through it with a SAX parser to divide it into
"units of work", then unmarshal each unit with Castor and process it
(e.g. add it to a database) immediately. This way you would need no more
than about 10 MB of RAM.
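As a rough illustration of the chunking idea (a minimal sketch using only the standard SAX API; the element name "item" and the helper class name are made up for the example, and the per-unit Castor unmarshalling is only indicated by a comment):

```java
import java.io.ByteArrayInputStream;
import java.io.InputStream;
import java.nio.charset.StandardCharsets;
import java.util.ArrayList;
import java.util.List;
import java.util.function.Consumer;

import javax.xml.parsers.SAXParserFactory;

import org.xml.sax.Attributes;
import org.xml.sax.helpers.DefaultHandler;

public class ChunkedProcessor {

    // Stream the document with SAX and hand each completed <item>
    // element's text to the consumer, so only one unit of work is
    // held in memory at a time.
    public static void processUnits(InputStream in, Consumer<String> handleUnit)
            throws Exception {
        DefaultHandler handler = new DefaultHandler() {
            private StringBuilder current;

            @Override
            public void startElement(String uri, String local, String qName,
                                     Attributes atts) {
                if ("item".equals(qName)) {
                    current = new StringBuilder();
                }
            }

            @Override
            public void characters(char[] ch, int start, int length) {
                if (current != null) {
                    current.append(ch, start, length);
                }
            }

            @Override
            public void endElement(String uri, String local, String qName) {
                if ("item".equals(qName)) {
                    // In a real application, this is where you would
                    // unmarshal the unit with Castor and process it
                    // (e.g. insert it into the database) immediately.
                    handleUnit.accept(current.toString());
                    current = null;
                }
            }
        };
        SAXParserFactory.newInstance().newSAXParser().parse(in, handler);
    }

    public static void main(String[] args) throws Exception {
        String xml = "<items><item>a</item><item>b</item><item>c</item></items>";
        List<String> seen = new ArrayList<>();
        processUnits(new ByteArrayInputStream(xml.getBytes(StandardCharsets.UTF_8)),
                     seen::add);
        System.out.println(seen);   // prints [a, b, c]
    }
}
```

The point is that the SAX parser never builds the whole document tree, so memory use stays proportional to one unit of work rather than to the 100 MB file.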
And if that is not possible, increase the JVM heap size (for example,
start it with "java -Xmx400m ...").
Michal
-----------------------------------------------------------
If you wish to unsubscribe from this mailing, send mail to
[EMAIL PROTECTED] with a subject of:
unsubscribe castor-dev