Hi Locke,

On Tue, Sep 15, 2009 at 6:15 PM, Locke <[email protected]> wrote:

>
> I would not be returning thousands of rows to users. This would be for
> background tasks executed via cron.
>
> The hard limits appengine imposes (30 seconds, 1000 query results,
> etc.) all seem quite reasonable for interactive apps, but for non-
> interactive background tasks, it looks like a lot of hacking will be
> required to get around those limits.
>

Splitting your background tasks into small chunks is a good practice in
general. It means that they can often be processed in parallel, and it avoids
arbitrarily long-running processes churning through your data. This is the
approach taken by frameworks like MapReduce.
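The chunk-and-chain pattern described above can be sketched in plain Python. This is a simulation only: the in-memory deque stands in for App Engine's task queue, and `run_chunked`, `process_batch`, and `BATCH_SIZE` are illustrative names, not App Engine APIs.

```python
# Simulate task-queue chaining: each "task" processes one small batch,
# then enqueues the next offset, instead of one long-running process.
from collections import deque

BATCH_SIZE = 100  # illustrative; pick a size that finishes well under 30s

def run_chunked(items, process_batch, batch_size=BATCH_SIZE):
    """Process `items` in small batches; each batch 'enqueues' the next,
    the way a deferred task would re-enqueue itself on App Engine."""
    queue = deque([0])  # pending start offsets
    while queue:
        start = queue.popleft()
        batch = items[start:start + batch_size]
        if not batch:
            continue
        process_batch(batch)
        if start + batch_size < len(items):
            queue.append(start + batch_size)  # chain the next chunk

processed = []
run_chunked(list(range(250)), processed.extend, batch_size=100)
# processed now holds all 250 items, handled in three batches of 100/100/50
```

Because each chunk is independent once its offset is known, the same structure lets multiple workers pull offsets in parallel.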

-Nick

>
> It would be nice if appengine were to someday support longer-running,
> bigger background tasks. Until then, it sounds like we will have to
> basically hash our datastores, chain TaskQueue operations, and do
> other shenanigans if we want to run big background jobs.
>
>
>
> On Sep 15, 11:21 am, "Nick Johnson (Google)" <[email protected]>
> wrote:
> > Hi Locke,
> > If you want to write such a wrapper, you're quite welcome to - it's
> > certainly not a violation of the TOS or anything like that. We'd
> > encourage you to examine why you need to do this, though - do you
> > really need to return several thousand results in a single request?
> > Will your users be able to use an interface that includes thousands of
> > items?
> >
> > -Nick
> >
> >
> >
> > On Tue, Sep 15, 2009 at 3:53 PM, Locke <[email protected]> wrote:
> >
> > > Interesting. That seems kinda tedious.
> >
> > > So suppose someone were to write a wrapper around the datastore get
> > > API that handled the 1000 entity limit by automatically splitting
> > > queries into 1k groups, but returned them seamlessly to the user as if
> > > they were one query. Would that be a violation of the appengine TOS?
> > > Would users of such a wrapper risk having their app shut off?
> >
> > > On Sep 15, 12:48 am, 风笑雪 <[email protected]> wrote:
> > > > You can sort the entities by __key__ and fetch the first 1000
> > > > entities. Then add a filter for __key__ greater than (or less than)
> > > > the 1000th entity's key to get the 1001st to 2000th entities, and
> > > > so on.
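[Editor's note: the __key__ paging idea above can be sketched outside App Engine, with a plain sorted list of (key, entity) pairs standing in for the datastore. `fetch_page` plays the role of a query ordered by __key__ with a `__key__ >` filter; all names here are assumptions for illustration, not datastore API calls.]

```python
# Key-based pagination: repeatedly fetch up to 1000 rows whose key is
# strictly greater than the last key seen, until a page comes back empty.
import bisect

PAGE_SIZE = 1000  # the batch-get / query result limit being worked around

def fetch_page(rows, after_key=None, limit=PAGE_SIZE):
    """Return up to `limit` rows with key > after_key (rows sorted by key)."""
    keys = [k for k, _ in rows]
    start = 0 if after_key is None else bisect.bisect_right(keys, after_key)
    return rows[start:start + limit]

def fetch_all(rows, limit=PAGE_SIZE):
    """Page through all rows `limit` at a time, like chained queries."""
    out, last_key = [], None
    while True:
        page = fetch_page(rows, after_key=last_key, limit=limit)
        if not page:
            return out
        out.extend(page)
        last_key = page[-1][0]  # next query filters past this key

rows = [(i, "entity-%d" % i) for i in range(2500)]
everything = fetch_all(rows, limit=1000)  # three pages: 1000 + 1000 + 500
```

Filtering on the key rather than using an offset is what keeps each page cheap: the query always starts right where the previous one stopped.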
> >
> > > > 2009/9/15 Locke <[email protected]>:
> >
> > > > > From the documentation: "maximum number of entities in a batch get:
> > > > > 1000"
> >
> > > > > Does this mean that if I have, say, 5000 accounts which I am
> > > > > keeping track of in the datastore, and I do a query to list all
> > > > > of them, it would not work? I would have to do five queries in a
> > > > > row, hard-coding distinct queries to get account numbers 0-999,
> > > > > 1000-1999, 2000-2999, 3000-3999, and 4000-4999?
> >
> > --
> > Nick Johnson, Developer Programs Engineer, App Engine
> > Google Ireland Ltd. :: Registered in Dublin, Ireland, Registration
> > Number: 368047
> >
>


-- 
Nick Johnson, Developer Programs Engineer, App Engine
Google Ireland Ltd. :: Registered in Dublin, Ireland, Registration Number:
368047
