Here's what I have done that seems to work:

1) select the entities you want to delete from the Datastore Admin
page, as you described
2) Manually monitor the CPU usage on the dashboard or quota page
3) Once the quota reaches a high value (say 75-80% for example), go to
the "Task Queues" page of the admin console
4) You should see a "default" queue, that contains the worker tasks
that are deleting your data
5) Use the "Pause Queue" button to pause the default queue
6) Wait until your quota is reset (done once per day)
7) With the new quota, resume the default task queue and repeat from
step 2. If you are deleting less than 1GB of data, you can probably
do this in one or two passes
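For context, the worker tasks in step 4 are essentially doing batched
keys-only deletes against the datastore. A minimal sketch of that idea
(the batching helper below is generic and runnable; the App Engine
calls are shown only in comments, and `MyModel` is a hypothetical kind):

```python
# Split a list of keys into chunks so each delete call stays under the
# datastore's per-call entity limit (500 entities at the time of writing).
def batches(keys, size=500):
    """Yield successive chunks of at most `size` keys."""
    for i in range(0, len(keys), size):
        yield keys[i:i + size]

# Inside a worker task you might then do something like:
#
#   from google.appengine.ext import db
#   keys = db.Query(MyModel, keys_only=True).fetch(2000)
#   for chunk in batches(keys, 500):
#       db.delete(chunk)   # each batch delete burns CPU quota
```

This is why pausing the queue works: each task is just another round of
these batch deletes, so no new quota is consumed while the queue is paused.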

Note: How high you let the quota go in step #3 above depends on how
much CPU your app normally needs... make sure to leave enough for your
app to function.
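The "how high" decision above can be written down as a simple headroom
check. This is only a sketch of the rule of thumb, not an App Engine
API; `used_fraction` is the CPU quota consumption you read off the
dashboard (0.0 to 1.0), and `reserve` is the headroom your app needs to
keep serving (both names are illustrative):

```python
# Decide when to pause the delete queue, per the note above: stop once
# today's CPU usage starts eating into the reserve your app needs.
def should_pause(used_fraction, reserve=0.2):
    """Return True once quota usage reaches (1 - reserve)."""
    return used_fraction >= 1.0 - reserve
```

With the default 20% reserve this pauses at 80% usage, matching the
75-80% range suggested in step 3; bump `reserve` up if your app is
CPU-hungry.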


On Nov 14, 4:25 am, Justin <[email protected]> wrote:
> I've been trying to bulk delete data from my application as described
> here
>
> http://code.google.com/appengine/docs/python/datastore/creatinggettin...
>
> This seems to have kicked off a series of mapreduce workers, whose
> execution is killing my CPU - approximately 5 mins later I have
> reached 100% CPU time and am locked out for the rest of the day.
>
> I figure I'll just delete by hand; create some appropriate :delete
> controllers and wait till the next day.
>
> Unfortunately the mapreduce process still seems to be running - 10
> past midnight and my CPU has reached 100% again.
>
> Is there some way to kill these processes and get back control of my
> app?

-- 
You received this message because you are subscribed to the Google Groups 
"Google App Engine" group.
To post to this group, send email to [email protected].
To unsubscribe from this group, send email to 
[email protected].
For more options, visit this group at 
http://groups.google.com/group/google-appengine?hl=en.