I'm creating a CLOB in my client code using:
object.setClob( new ClobImpl( new FileReader( file ), file.length() ) );
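(One thing I noticed while double-checking that call: File.length() returns the size in bytes, while a CLOB length is usually measured in characters, and the two diverge for any multi-byte encoding. A minimal sketch with plain JDK classes, no Castor or Oracle types, showing the mismatch; ClobLengthCheck and its methods are just illustrative names:)

```java
import java.nio.charset.StandardCharsets;

public class ClobLengthCheck {
    /** Byte length of the text when encoded, i.e. what File.length() reports on disk. */
    static int byteLength(String text) {
        return text.getBytes(StandardCharsets.UTF_8).length;
    }

    /** Character length, which is what a CLOB length should describe. */
    static int charLength(String text) {
        return text.length();
    }

    public static void main(String[] args) {
        String ascii = "hello";
        String accented = "h\u00e9llo"; // 'é' encodes as two bytes in UTF-8
        System.out.println(byteLength(ascii) + " vs " + charLength(ascii));       // 5 vs 5
        System.out.println(byteLength(accented) + " vs " + charLength(accented)); // 6 vs 5
    }
}
```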
Upon writing a file that is 4k or a file that is 4mb, I receive the error:
[test] java.sql.SQLException: ORA-24804: illegal parameter value in lob write function
[test] at oracle.jdbc.dbaccess.DBError.throwSqlException(DBError.java:168)
[test] at oracle.jdbc.dbaccess.DBError.throwSqlException(DBError.java:326)
[test] at oracle.jdbc.driver.OracleStatement.executeNonQuery(OracleStatement.java:1460)
[test] at oracle.jdbc.driver.OracleStatement.doExecuteOther(OracleStatement.java:1371)
[test] at oracle.jdbc.driver.OracleStatement.doExecuteWithTimeout(OracleStatement.java:1900)
[test] at oracle.jdbc.driver.OraclePreparedStatement.executeUpdate(OraclePreparedStatement.java:363)
[test] at org.exolab.castor.jdo.engine.SQLEngine.store(SQLEngine.java:849)
[test] at org.exolab.castor.persist.ClassMolder.store(ClassMolder.java:1544)
[test] at org.exolab.castor.persist.LockEngine.store(LockEngine.java:745)
[test] at org.exolab.castor.persist.TransactionContext.prepare(TransactionContext.java:1162)
[test] at org.exolab.castor.jdo.engine.DatabaseImpl.commit(DatabaseImpl.java:498)
[test] at com.iwitness.prototype.Test.run(Test.java:203)
[test] at com.iwitness.prototype.Test.main(Test.java:46)
Upon writing a file that is 100mb, I receive the error:
OutOfMemoryError
<no stack trace available>
In the mapping descriptor, I've disabled caching for this class and I've disabled
dirty checking for the CLOB field.
Does anyone know why I might be getting the OutOfMemoryError? Or even the Oracle
error? Any ideas/clues?
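(For what it's worth, my guess is the OutOfMemoryError means the whole file content is being buffered in memory before the write reaches the database; copying in fixed-size chunks keeps memory use flat no matter how big the CLOB is. A self-contained sketch with plain JDK streams; the copyChars helper is hypothetical, not a Castor or Oracle API:)

```java
import java.io.IOException;
import java.io.Reader;
import java.io.StringReader;
import java.io.StringWriter;
import java.io.Writer;

public class ChunkedCopy {
    /** Copy character data in fixed-size chunks so only one small buffer
     *  is ever resident, regardless of the total amount of data. */
    static long copyChars(Reader in, Writer out, int bufSize) throws IOException {
        char[] buf = new char[bufSize];
        long total = 0;
        int n;
        while ((n = in.read(buf)) != -1) {
            out.write(buf, 0, n);
            total += n;
        }
        return total;
    }

    public static void main(String[] args) throws IOException {
        // Simulate a large character source without allocating it twice.
        StringBuilder big = new StringBuilder();
        for (int i = 0; i < 10_000; i++) big.append("0123456789");
        StringWriter sink = new StringWriter();
        long copied = copyChars(new StringReader(big.toString()), sink, 8 * 1024);
        System.out.println(copied + " chars copied"); // 100000 chars copied
    }
}
```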
Thanks,
Bruce
--
perl -e 'print unpack("u30","<0G)U8V4\@4VYY9&5R\"F9E<G)E=\$\!F<FEI+F-O;0\`\`");'
-----------------------------------------------------------
If you wish to unsubscribe from this mailing, send mail to
[EMAIL PROTECTED] with a subject of:
unsubscribe castor-dev