Try using the low-level API and run a keys-only query with a limit of 1000, deleting the returned keys, repeatedly, instead of retrieving whole objects.
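A minimal sketch of that fetch-a-batch-of-keys-then-delete loop is below. The `DatastoreOps` interface is a hypothetical seam, not part of the SDK: on App Engine its two methods would map to a keys-only low-level query (`new Query("RecordedValue").setKeysOnly()` fetched with `FetchOptions.Builder.withLimit(batchSize)`) and to `DatastoreService.delete(keys)`. A small in-memory fake stands in for the datastore so the loop can be exercised directly.

```java
import java.util.ArrayList;
import java.util.List;

// Sketch of the batched keys-only delete suggested above.
public class BatchDelete {

    /** Hypothetical seam over the low-level datastore API. */
    public interface DatastoreOps {
        /** Return up to 'limit' matching entity keys (a keys-only query). */
        List<String> fetchKeys(int limit);
        /** Delete the entities for the given keys in one batch call. */
        void deleteKeys(List<String> keys);
    }

    /** Repeatedly fetch at most batchSize keys and delete them until none remain. */
    public static long deleteInBatches(DatastoreOps ds, int batchSize) {
        long deleted = 0;
        while (true) {
            List<String> keys = ds.fetchKeys(batchSize);
            if (keys.isEmpty()) {
                break;  // nothing left to delete
            }
            ds.deleteKeys(keys);
            deleted += keys.size();
        }
        return deleted;
    }

    public static void main(String[] args) {
        // In-memory stand-in for the datastore: 2,500 fake keys.
        final List<String> store = new ArrayList<String>();
        for (int i = 0; i < 2500; i++) {
            store.add("key-" + i);
        }
        DatastoreOps fake = new DatastoreOps() {
            public List<String> fetchKeys(int limit) {
                int n = Math.min(limit, store.size());
                return new ArrayList<String>(store.subList(0, n));
            }
            public void deleteKeys(List<String> keys) {
                store.removeAll(keys);
            }
        };
        // Three batches: 1000 + 1000 + 500.
        long deleted = deleteInBatches(fake, 1000);
        System.out.println(deleted + " deleted, " + store.size() + " remaining");
    }
}
```

Since each iteration only materializes keys, never whole entities, memory stays bounded by the batch size regardless of how many objects match.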
I am guessing the out-of-memory error is due to the large number of objects returned by the query. Keys-only queries also use much less API CPU.

On Feb 12, 9:22 pm, Benjamin <[email protected]> wrote:
> I'm getting errors when a task kicks off to delete a lot of data based
> on a timestamp. I enabled billing and already chewed through $0.50 in
> CPU time, but I'm still getting the error message. Is there anything
> else I should do? I was trying to avoid splitting the task up with a
> result limit or something; I really just need to blow away persisted
> objects that have a timestamp older than a specified date. This
> snippet of code causes the error:
>
> PersistenceManager pm = PMF.get().getPersistenceManager();
> Calendar d = Calendar.getInstance();
> long retVal = 0;
> if (expDays > 0)
> {
>     d.add(Calendar.DATE, (expDays * -1));
>     Query q = pm.newQuery(RecordedValue.class, "pointFK == k && timestamp < d");
>     q.declareImports("import java.util.Date");
>     Map<String, Object> args = new HashMap<String, Object>();
>     args.put("k", pointId);
>     args.put("d", d.getTime());
>     retVal = q.deletePersistentAll(args);
> }
> pm.close();
> return retVal;

--
You received this message because you are subscribed to the Google Groups "Google App Engine for Java" group.
To post to this group, send email to [email protected].
To unsubscribe from this group, send email to [email protected].
For more options, visit this group at http://groups.google.com/group/google-appengine-java?hl=en.
