Not that I can think of. I would expect this any time gigabytes of data are loaded into memory. Best of luck, -jacobd
On Wed, Feb 3, 2010 at 12:50 PM, dg69 <[email protected]> wrote:
>
> I have to convert a flat file to an XML file so that an application can
> load the data on a regular basis. The flat files are huge (gigabytes,
> because of the record length and millions of records), and when I convert
> them I get an out-of-memory error after, say, 100,000 records. I expected
> that, since this is memory based. I have added more memory to the JVM and
> can get another 50,000 records processed.
> One way to avoid this is to create smaller XML files, for example write
> 50,000 records to one file and then start another. Is there a better
> approach?
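For what it's worth, here is a rough sketch of the chunking approach described above, using plain StAX (javax.xml.stream.XMLStreamWriter) rather than XMLBeans to write each chunk. The file names (input.dat, output-N.xml), the <records>/<record> element names, the one-line-per-record parsing, and the 50,000-record chunk size are all placeholders to adapt to your actual record layout and schema:

import java.io.BufferedReader;
import java.io.FileOutputStream;
import java.io.FileReader;
import java.io.OutputStream;
import java.util.ArrayList;
import java.util.List;

import javax.xml.stream.XMLOutputFactory;
import javax.xml.stream.XMLStreamWriter;

public class FlatFileSplitter {

    // Placeholder: tune to whatever record count stays comfortably inside the heap.
    private static final int RECORDS_PER_FILE = 50000;

    public static void main(String[] args) throws Exception {
        BufferedReader in = new BufferedReader(new FileReader("input.dat"));
        List<String> chunk = new ArrayList<String>(RECORDS_PER_FILE);
        int fileIndex = 0;
        try {
            String line;
            while ((line = in.readLine()) != null) {
                chunk.add(line);
                if (chunk.size() == RECORDS_PER_FILE) {
                    writeChunk(chunk, ++fileIndex);
                    chunk.clear();
                }
            }
            // Flush whatever is left over after the last full chunk.
            if (!chunk.isEmpty()) {
                writeChunk(chunk, ++fileIndex);
            }
        } finally {
            in.close();
        }
    }

    // Writes one chunk as its own small XML document, so the loading
    // application never has to hold more than RECORDS_PER_FILE records at once.
    private static void writeChunk(List<String> records, int fileIndex) throws Exception {
        OutputStream out = new FileOutputStream("output-" + fileIndex + ".xml");
        try {
            XMLStreamWriter xml =
                    XMLOutputFactory.newInstance().createXMLStreamWriter(out, "UTF-8");
            xml.writeStartDocument("UTF-8", "1.0");
            xml.writeStartElement("records");
            for (String record : records) {
                // Hypothetical layout: the whole flat-file line becomes the text
                // of a <record> element; substitute your real field parsing here.
                xml.writeStartElement("record");
                xml.writeCharacters(record);
                xml.writeEndElement();
            }
            xml.writeEndElement();   // </records>
            xml.writeEndDocument();
            xml.close();
        } finally {
            out.close();
        }
    }
}

Since only one chunk of lines is ever in memory, the heap usage stays roughly constant no matter how large the flat file is, and the chunk size only changes how many output files you end up with.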
