Hi,
Is this a trick question?  :-)  By setting the FetchBatchSize to
Integer.MAX_VALUE, you are asking OpenJPA to traverse all relationships and
instantiate every entity that is touched.  If your object graph is large or
deeply connected, this can turn into a very large result set and, depending
on the memory available, you can easily exhaust the heap and get an OOM
exception.  By your own experimentation, you have already figured out that
specifying a concrete value works just fine.  So I'm just confused as to
what the real question is...

More information on FetchBatchSize can be found here:
http://openjpa.apache.org/builds/latest/docs/manual/manual.html#ref_guide_dbsetup_lrs
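To illustrate why a bounded batch size keeps memory flat, here is a small
self-contained Java sketch (plain Java, no OpenJPA on the classpath; the
`fetchBatch` helper just simulates cursor-style fetching, and the row counts
are made up for the demo).  With a concrete batch size, only one batch of
rows is resident at a time; with Integer.MAX_VALUE, the whole result set
would be materialized in one go:

```java
import java.util.ArrayList;
import java.util.List;

public class BatchFetchDemo {
    // Simulates a database cursor: rows are materialized only when asked for.
    static List<int[]> fetchBatch(int offset, int batchSize, int totalRows) {
        List<int[]> batch = new ArrayList<>();
        int end = Math.min(offset + (int) Math.min((long) batchSize, totalRows), totalRows);
        for (int i = offset; i < end; i++) {
            batch.add(new int[] { i }); // stand-in for one entity row
        }
        return batch;
    }

    public static void main(String[] args) {
        int totalRows = 1000;
        int batchSize = 100;   // a concrete value, as in the working case
        int processed = 0;
        int maxResident = 0;   // peak number of rows held in memory at once

        for (int offset = 0; offset < totalRows; offset += batchSize) {
            List<int[]> batch = fetchBatch(offset, batchSize, totalRows);
            maxResident = Math.max(maxResident, batch.size());
            processed += batch.size();
            // batch goes out of scope here, so memory use stays bounded
        }
        System.out.println(processed + " rows processed, peak resident " + maxResident);
    }
}
```

In real OpenJPA code the equivalent knob is set on the fetch plan, e.g.
`openJPAQuery.getFetchPlan().setFetchBatchSize(100)`; the demo only shows
why a bounded value keeps the peak resident row count at the batch size
rather than at the full result-set size.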

Kevin

On Mon, Nov 16, 2009 at 5:09 AM, dileep55 <[email protected]> wrote:

>
> Hi,
>
> Environment:
>
> Apache ODE : 1.3.4
> Open JPA : 1.3.0 snapshot
> DB : oracle 9i
>
> In BpelDAOConnectionImpl of Apache ODE, in instanceQuery(instanceFilter),
>
> when setFetchBatchSize is called with Integer.MAX_VALUE, it throws an out
> of memory exception.
>
> However, when the batch size is set to some value like 0 or 100, the
> results are fetched correctly without any issues.
>
>
> Does anyone have any idea about this kind of behaviour?
>
> Thanks
> Dileep
> --
> View this message in context:
> http://n2.nabble.com/OpenJPA-Fetch-plan-setFetchBatchSize-gives-out-of-memory-error-when-used-with-Oracle-9i-tp4011421p4011421.html
> Sent from the OpenJPA Users mailing list archive at Nabble.com.
>