Dear experts,
I am running some code that reads 800,000 records from a database, but I get an
"out of memory: heap..." error.
I use the following Cayenne code:

        Expression exp_date = Expression.fromString("dateRetrieval >=
$start_d and dateRetrieval < $end_d and type=$type");//("type=$type");
        Map parameters =new HashMap();
       parameters.put("end_d", "2006-03-09" );
       parameters.put("start_d", "2005-10-07" );
       parameters.put("type", "Job");
       exp_date = exp_date.expWithParameters(parameters);
       final SelectQuery feedsquery = new SelectQuery(FeedsAll.class,
exp_date);
       int count=0;
       try{
                
        final ResultIterator it_q =  context.performIteratedQuery(feedsquery);
                while(it_q.hasNextRow()){               
        
                       final Map row = it_q.nextDataRow();
                       final FeedsAll obj = (FeedsAll)
context.objectFromDataRow(FeedsAll.class, new DataRow(row), true);
                     .....
                     context.deleteObject(obj);
                }
       }
       catch(Exception e){
           //System.out.println("Fatal Error: "+e);
           log.error("Fatal Error: ",e);
           log.info(e.getStackTrace());
       }

What am I doing wrong? My understanding was that with performIteratedQuery I
could read a huge number of records without memory problems.
Can you help me?
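
In case it is useful, here is a minimal sketch of the kind of batching I am
wondering about, starting from the same feedsquery and context as above. The
1000-row batch size and the periodic context.commitChanges() are my own
guesses, not something I found in the documentation:

        ResultIterator it_q = null;
        int processed = 0;
        try {
            it_q = context.performIteratedQuery(feedsquery);
            while (it_q.hasNextRow()) {
                final Map row = it_q.nextDataRow();
                final FeedsAll obj = (FeedsAll) context.objectFromDataRow(
                        FeedsAll.class, new DataRow(row), true);
                context.deleteObject(obj);
                processed++;

                // my guess: flush the deletions every 1000 rows so the
                // DataContext does not keep all 800,000 registered objects
                // in memory at the same time
                if (processed % 1000 == 0) {
                    context.commitChanges();
                }
            }
            // commit whatever is left after the loop
            context.commitChanges();
        }
        finally {
            if (it_q != null) {
                try {
                    it_q.close();
                }
                catch (Exception e) {
                    log.warn("Could not close ResultIterator", e);
                }
            }
        }

Is periodically committing (or replacing) the DataContext the right way to keep
memory under control, or is there a better approach?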

Thanks a lot
Marco
