If you commit while there is an active batch, iBATIS will automatically execute the batch before the commit.  So you can do this to cause intermediate commits in a large batch:
 
try {
  startTransaction();
 
  startBatch();
  ...100 inserts
  executeBatch();  // this line is optional
  commitTransaction();
 
  startBatch();
  ...100 inserts
  executeBatch();  // this line is optional
  commitTransaction();
 
  etc.
} finally {
  endTransaction();
}
 
You can also do it this way (which will cause one large commit):
 
try {
  startTransaction();
 
  startBatch();
  ...100 inserts
  executeBatch();
 
  startBatch();
  ...100 inserts
  executeBatch();
 
  etc.
 
  commitTransaction();
 
} finally {
  endTransaction();
}
 
But then you'll have one large commit - which could be the source of your problem.  I would go with option 1 (intermediate commits).
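To make the shape of option 1 concrete, here is a minimal, self-contained sketch.  The BatchSession interface and insertInChunks helper are hypothetical stand-ins (in real code these calls would go through SqlMapClient, and would throw SQLException), but the control flow matches the first example above: one batch and one commit per chunk of records.

```java
import java.util.List;

// Hypothetical stand-in, not the real iBATIS API: SqlMapClient has
// startTransaction/startBatch/executeBatch/commitTransaction/
// endTransaction, but this simplified interface lets the chunking
// logic be shown self-contained.
interface BatchSession {
    void startTransaction();
    void startBatch();
    void insert(Object record);
    int executeBatch();          // returns the number of rows affected
    void commitTransaction();
    void endTransaction();
}

public class ChunkedBatchInsert {
    // Option 1 from above: one batch and one commit per chunk, so the
    // driver never buffers more than chunkSize statements and the
    // database never holds one huge uncommitted transaction.
    public static void insertInChunks(BatchSession session,
                                      List<?> records, int chunkSize) {
        try {
            session.startTransaction();
            for (int start = 0; start < records.size(); start += chunkSize) {
                int end = Math.min(start + chunkSize, records.size());
                session.startBatch();
                for (Object record : records.subList(start, end)) {
                    session.insert(record);
                }
                session.executeBatch();       // flush the batch
                session.commitTransaction();  // intermediate commit
            }
        } finally {
            session.endTransaction();
        }
    }
}
```

With 1000 records and a chunk size of 100, this runs ten batches and ten commits instead of accumulating everything for one commit at the end.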
 
Jeff Butler
 

 
On 6/21/06, jaybytez <[EMAIL PROTECTED]> wrote:

In this example method that I have, do I want to do a commitTransaction
every time after I do an executeBatch... or at the end of my batch process?

So if I have 1000 records and I execute a batch of them every 100 records:
do I go to 100, then executeBatch and commitTransaction, then go to the next
100 and executeBatch and commitTransaction?  Or do I executeBatch for every
100 records and then commitTransaction once at the end?

The reason I ask is that I am running a batch process that inserts 100,000
or so records, and my WebLogic instance keeps crashing with OutOfMemory (even
better, a rollback cannot occur, so half of my transaction commits).  I
don't have this issue running regular SQL, so I am trying to discover what I
am doing wrong to cause this performance hit.

Thanks,

-jay blanton
--
View this message in context: http://www.nabble.com/executeBatch-not-returning-int-of-records-updated.-t1819686.html#a4978497
Sent from the iBATIS - User - Java forum at Nabble.com.

