Adding to Ian's suggestion, you might want to look at Query Cursors
and
pre-prepare reports ahead of time...
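A rough sketch of the cursor idea with the low-level datastore API (the kind name "Transaction", the "month" property, and the page size are my assumptions, and this needs the App Engine SDK on the classpath):

```java
import com.google.appengine.api.datastore.Cursor;
import com.google.appengine.api.datastore.DatastoreService;
import com.google.appengine.api.datastore.DatastoreServiceFactory;
import com.google.appengine.api.datastore.Entity;
import com.google.appengine.api.datastore.FetchOptions;
import com.google.appengine.api.datastore.Key;
import com.google.appengine.api.datastore.Query;
import com.google.appengine.api.datastore.QueryResultList;

public class MonthlyExport {

    // Fetch one user's entries for a month in pages, so no single
    // datastore call runs long enough to hit the request deadline.
    public static void writeCsv(Key userKey, String month, Appendable out)
            throws java.io.IOException {
        DatastoreService ds = DatastoreServiceFactory.getDatastoreService();
        Query q = new Query("Transaction", userKey);   // ancestor query by user
        q.addFilter("month", Query.FilterOperator.EQUAL, month);

        int pageSize = 500;
        Cursor cursor = null;
        while (true) {
            FetchOptions opts = FetchOptions.Builder.withLimit(pageSize);
            if (cursor != null) {
                opts.startCursor(cursor);              // resume where we left off
            }
            QueryResultList<Entity> page = ds.prepare(q).asQueryResultList(opts);
            for (Entity e : page) {
                out.append(e.getProperty("date") + "," + e.getProperty("amount") + "\n");
            }
            if (page.size() < pageSize) {
                break;                                 // last page
            }
            cursor = page.getCursor();                 // carry over to next chunk
        }
    }
}
```

With JDO you'd get the same cursor back via the JDO cursor helper instead of the low-level API, but the pattern is identical.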

or pre-prepare the reports iteratively, compress them, store the result
as a blob and download that...
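For the compress-and-store step, a self-contained sketch with java.util.zip (the class name is my own; on GAE the resulting byte[] could be stored in a datastore Blob property):

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.util.zip.GZIPInputStream;
import java.util.zip.GZIPOutputStream;

public class ReportCompressor {

    // Compress a generated CSV report; CSV text compresses very well,
    // so the stored blob is typically a small fraction of the original.
    public static byte[] gzip(String csv) throws IOException {
        ByteArrayOutputStream bos = new ByteArrayOutputStream();
        GZIPOutputStream gz = new GZIPOutputStream(bos);
        gz.write(csv.getBytes("UTF-8"));
        gz.close();                       // flushes and writes the gzip trailer
        return bos.toByteArray();
    }

    // Decompress again at download time.
    public static String gunzip(byte[] data) throws IOException {
        GZIPInputStream gz = new GZIPInputStream(new ByteArrayInputStream(data));
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        byte[] buf = new byte[4096];
        int n;
        while ((n = gz.read(buf)) != -1) {
            out.write(buf, 0, n);
        }
        gz.close();
        return out.toString("UTF-8");
    }
}
```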


On May 31, 1:24 pm, Ian Marshall <[email protected]> wrote:
> You can reduce time-out issues if you paginate your query by obtaining
> contiguous chunks, one at a time. A lot depends on your data exchange
> interface. I use JDO, and GAE/J makes query cursors available for
> this. I don't know how other interfaces allow cursor/pagination
> operations. (Twig and Objectify have good reputations.)
>
> You can use deferred tasks to break work up into smaller chunks. I
> haven't looked at MapReduce. This might help you too?
>
> Does this help?
>
> On May 30, 7:13 am, Jacob <[email protected]> wrote:
> > I have a Java application that has approximately 100 users. In this
> > application there is a table that would have the equivalent of 100,000
> > entities added per day, ie 100 users each doing 1000 inserts per day.
>
> > From time to time I will need to output a CSV file that shows one
> > month's worth of entries for a particular user, ie this would result
> > in a file with 30,000 entries.
>
> > If I understand correctly, the entities would be given an ancestor
> > record to allow querying the transactions by user, and then filtering
> > them by month.
>
> > Am I going to have timeout issues when querying by user+month, and in
> > the case where I need to export a month's worth of data?
>
> > Any feedback much appreciated!
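Ian's deferred-task idea above could be sketched roughly like this (the class and field names are my own invention, and this needs the App Engine SDK on the classpath):

```java
import com.google.appengine.api.taskqueue.DeferredTask;
import com.google.appengine.api.taskqueue.QueueFactory;
import com.google.appengine.api.taskqueue.TaskOptions;

// Processes one chunk of the export, then re-enqueues itself with the
// cursor for the next chunk, so no single task runs near the deadline.
public class ExportChunkTask implements DeferredTask {

    private final String userId;
    private final String cursorString;   // null on the first chunk

    public ExportChunkTask(String userId, String cursorString) {
        this.userId = userId;
        this.cursorString = cursorString;
    }

    @Override
    public void run() {
        // ... fetch one page of entities starting at cursorString, append
        // the rows to the partially built report, then:
        String nextCursor = /* cursor returned by this page, or */ null;
        if (nextCursor != null) {
            QueueFactory.getDefaultQueue().add(
                TaskOptions.Builder.withPayload(
                    new ExportChunkTask(userId, nextCursor)));
        }
    }
}
```

DeferredTask is serialized into the task payload, which is why the state carried between chunks is just two strings.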

-- 
You received this message because you are subscribed to the Google Groups 
"Google App Engine for Java" group.
To post to this group, send email to [email protected].
To unsubscribe from this group, send email to 
[email protected].
For more options, visit this group at 
http://groups.google.com/group/google-appengine-java?hl=en.
