I can definitely do the work in a task, but the deadline I'm hitting is the
read deadline from the datastore. I believe that will be the same for
tasks, right?
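
For reference, the batching I had in mind is just chunking the keys-only
results into fixed-size groups, each small enough to fetch (or hand to a
task) well inside the deadline. Roughly this Python sketch; BATCH_SIZE is
a guess and would need tuning:

```python
# Illustrative sketch: split a keys-only result list into fixed-size
# batches so each batch can be fetched separately (e.g. via get_multi,
# or enqueued as its own task). Pure Python, no App Engine imports.

BATCH_SIZE = 500  # guess; tune against the datastore read deadline


def batch_keys(keys, batch_size=BATCH_SIZE):
    """Yield successive fixed-size slices of a key list."""
    for start in range(0, len(keys), batch_size):
        yield keys[start:start + batch_size]


# Example with stand-in keys:
keys = list(range(1200))
batches = list(batch_keys(keys))
# 1200 keys at 500 per batch -> 3 batches (500, 500, 200)
```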

Thanks,
Phil


On Fri, Aug 24, 2012 at 12:18 AM, timh <[email protected]> wrote:

> If your count keeps increasing you will always run into some sort of time
> limit. Why not consider doing this processing in a task (they can run for
> 10 mins), or in multiple tasks? I am assuming you're trying to summarise etc.
>
> T
>
>
> On Friday, August 24, 2012 3:26:00 AM UTC+8, Phil wrote:
>>
>> In some initialization work my app needs to run through all of the
>> datastore entities of a given kind.  I have a lot of these entities (80k
>> currently) and the count is increasing rapidly. I'm currently trying to
>> read these in using a single datastore query, but I'm running up against
>> the default datastore timeout of 30 seconds.
>>
>> Is there a good practice for sharding this or otherwise breaking this up
>> so that I won't hit these deadlines? I was thinking I would do a keys-only
>> query and then break up the keys into a number of reasonably sized
>> sub-queries, but perhaps there is a better approach out there?
>>
>> Thanks,
>> Phil
>>
>  --
> You received this message because you are subscribed to the Google Groups
> "Google App Engine" group.
> To view this discussion on the web visit
> https://groups.google.com/d/msg/google-appengine/-/5x0WQc9ODVwJ.
> To post to this group, send email to [email protected].
> To unsubscribe from this group, send email to
> [email protected].
> For more options, visit this group at
> http://groups.google.com/group/google-appengine?hl=en.
>
