While inserting a large data record (about 450 KB) into a CLOB column, I am getting an OutOfMemoryError. The error occurs after roughly 200 such inserts. Below is the log line and stack trace:

DBAppender::updateParameters()::Data.Length()::432164
Exception in thread "AppointmentFileListener" java.lang.OutOfMemoryError: Java heap space
        at java.util.Arrays.copyOf(Arrays.java:2367)
        at java.lang.AbstractStringBuilder.expandCapacity(AbstractStringBuilder.java:130)
        at java.lang.AbstractStringBuilder.ensureCapacityInternal(AbstractStringBuilder.java:114)
        at java.lang.AbstractStringBuilder.append(AbstractStringBuilder.java:415)
        at java.lang.StringBuffer.append(StringBuffer.java:237)
        at java.io.StringWriter.write(StringWriter.java:101)

I have tried every solution I could think of:
- Ensuring all ResultSet and Statement objects are closed after each transaction completes.
- Setting the CLOB value with setString(), setClob(), and setAsciiStream().
- Increasing the heap size to 1 GB and then 2 GB.

Any suggestions would be really appreciated. Thanks
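For context, the stack trace shows the CLOB content being accumulated through a StringWriter/StringBuffer into one large String before it reaches the driver. A streaming insert via PreparedStatement.setCharacterStream would let the driver copy the value in chunks instead. A minimal sketch of that pattern is below; the table and column names (APPOINTMENT, CONTENT) are placeholders, not a real schema, and the main method only demonstrates the chunked-copy idea with a StringReader since no database is attached here:

```java
import java.io.Reader;
import java.io.StringReader;
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.SQLException;

public class ClobStreamInsert {

    // Sketch: pass the CLOB as a Reader so the driver can consume it in
    // chunks, instead of materializing one ~450 KB String on the heap.
    // Table and column names are illustrative only.
    static void insertClob(Connection conn, int id, Reader content, long length)
            throws SQLException {
        try (PreparedStatement ps = conn.prepareStatement(
                "INSERT INTO APPOINTMENT (ID, CONTENT) VALUES (?, ?)")) {
            ps.setInt(1, id);
            ps.setCharacterStream(2, content, length); // JDBC streaming setter
            ps.executeUpdate();
        } // try-with-resources closes the statement even on failure
    }

    public static void main(String[] args) throws Exception {
        // Without a database handy, show the chunked-copy idea the streaming
        // setter relies on: a fixed 8 KB buffer, never one giant String.
        Reader source = new StringReader("x".repeat(450_000)); // ~450 KB record
        char[] buf = new char[8192];
        long copied = 0;
        int n;
        while ((n = source.read(buf)) != -1) {
            copied += n; // a real driver would write each chunk into the CLOB
        }
        System.out.println(copied); // total characters streamed
    }
}
```

The key difference from setString() is that the full value never has to exist as a single contiguous String (plus StringBuffer copies) on the heap at once.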
-- View this message in context: http://apache-database.10148.n7.nabble.com/Embedded-Derby-causing-out-of-memory-error-tp143925.html Sent from the Apache Derby Developers mailing list archive at Nabble.com.
