Hi,
I am noticing a big performance drop when executing the same method twice,
in two separate transactions but with the same PersistenceManager, involving
a large number of objects. The method looks like this:
public void deleteObjects() {
    pm.currentTransaction().begin();
    Set allObjectsToDelete = new HashSet();
    collectObjectsToDelete(objectIDs, allObjectsToDelete);
    updateReferences(allObjectsToDelete);
    pm.deletePersistentAll(allObjectsToDelete);
    pm.currentTransaction().commit();
}
allObjectsToDelete contains about 20,000 objects. When executing this method
a second time, there is a performance hit of a factor of 3 to 4. When
profiling this, a new hotspot pops up in the second run:
org.apache.commons.collections.map.AbstractHashedMap.clear() takes
413200 ms, whereas in the first run it takes only 213 ms! (There is only a
10% increase in the number of invocations.) I've included some screenshots
of the profiling data. In fact, after the first delete I notice that many
operations (like iterating a single PC collection or retrieving other
fields) have dropped in performance; I guess this is related?
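One thing I suspect (an assumption on my part, not confirmed by the profile alone): AbstractHashedMap.clear(), like java.util.HashMap.clear(), nulls out the entire bucket array, so its cost scales with the capacity the map has grown to, not with the number of live entries. If an internal map is reused after growing to hold ~20,000 entries in the first run, every later clear() keeps paying for that capacity. A minimal stdlib sketch of the effect, using java.util.HashMap as a stand-in for AbstractHashedMap:

```java
import java.util.HashMap;
import java.util.Map;

// Demonstrates that HashMap.clear() cost tracks the map's grown
// capacity rather than its current size: the bucket array is walked
// in full on every clear(), however few entries remain.
public class ClearCost {

    // Fill a fresh map with the given number of entries, clear it,
    // and return its size afterwards (always 0).
    static int sizeAfterClear(int entries) {
        Map<Integer, Integer> map = new HashMap<>();
        for (int i = 0; i < entries; i++) {
            map.put(i, i);
        }
        map.clear();   // walks the whole bucket array
        return map.size();
    }

    // Time a single clear() on a map pre-grown to the given size.
    static long nanosToClear(int entries) {
        Map<Integer, Integer> map = new HashMap<>();
        for (int i = 0; i < entries; i++) {
            map.put(i, i);
        }
        long start = System.nanoTime();
        map.clear();
        return System.nanoTime() - start;
    }

    public static void main(String[] args) {
        long small = nanosToClear(16);        // tiny bucket array
        long large = nanosToClear(2_000_000); // array grown to millions of slots
        System.out.printf("clear() small: %d ns, large: %d ns%n", small, large);
    }
}
```

On my understanding, the large-capacity clear() should be markedly slower than the small one, even though both maps end up empty either way.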
kind regards,
Christiaan
--
View this message in context:
http://www.nabble.com/Performance-drop-in-AbstractHashedMap.clear%28%29-tf4769771.html#a13643473
Sent from the OpenJPA Developers mailing list archive at Nabble.com.