Hi, I'm reading a huge table with OpenJPA in chunks, like this:
    List list = query.setMaxResults(batchSize)
                     .setFirstResult(position * batchSize)
                     .getResultList();

and iterate over it like this:

    for (Entity e : list) {
        // entityManager.detach(e);
    }

My Java heap slowly fills up to close to 100% and then stays around that level, sometimes fluctuating. I uncommented entityManager.detach(e), but the behavior is more or less the same: the whole heap fills up. In small tests I have not hit an OutOfMemoryError yet, but the larger production application I need this for has already thrown some after running for a while.

I don't really understand this behavior; is there anything I can do about it? I already tried removing the detached-state field from the entity, but that does not seem to help either. What does help is repeatedly calling entityManager.clear() to clear the whole persistence context, but that is not always desirable.

So, besides possible solutions, I also wonder whether this behavior is "normal" or whether I should try to create a small test case for it as well.

Regards,
Michael
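
For reference, a minimal, self-contained sketch of the batched read loop described above, with the periodic entityManager.clear() workaround applied after each chunk. The persistence-unit name ("my-unit"), entity class (MyEntity), JPQL query, and batch size are placeholders, not the actual production code:

    import java.util.List;
    import javax.persistence.EntityManager;
    import javax.persistence.EntityManagerFactory;
    import javax.persistence.Persistence;
    import javax.persistence.TypedQuery;

    public class BatchReader {

        private static final int BATCH_SIZE = 1000;   // placeholder batch size

        public static void main(String[] args) {
            EntityManagerFactory emf =
                    Persistence.createEntityManagerFactory("my-unit");
            EntityManager em = emf.createEntityManager();
            try {
                int position = 0;
                while (true) {
                    TypedQuery<MyEntity> query =
                            em.createQuery("SELECT e FROM MyEntity e", MyEntity.class);
                    List<MyEntity> list = query
                            .setMaxResults(BATCH_SIZE)
                            .setFirstResult(position * BATCH_SIZE)
                            .getResultList();
                    if (list.isEmpty()) {
                        break;                         // no more rows to read
                    }
                    for (MyEntity e : list) {
                        process(e);                    // placeholder for the real work
                    }
                    // Workaround: drop the whole persistence context after each
                    // batch so the managed instances can be garbage-collected.
                    em.clear();
                    position++;
                }
            } finally {
                em.close();
                emf.close();
            }
        }

        private static void process(MyEntity e) {
            // real per-entity processing goes here
        }
    }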