Are you doing batch gets/puts? They are executed in parallel and might account for the perceived difference in CPU hours. Also, are you taking latency into account? Have you checked the logs? It would also definitely help to know what your server-side requests are doing.
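A back-of-the-envelope sketch of how this can happen: billed CPU time is summed over all work done on your behalf, so parallel datastore operations inside a single request can bill far more CPU time than the wall-clock time that elapses. All of the numbers below are hypothetical, just to show the arithmetic:

```python
# Hypothetical figures -- not actual App Engine rates or your app's real numbers.
wall_clock_hours = 2       # the driver script ran for ~2 hours
requests_per_hour = 1200   # serial HTTP requests made by the driver
batch_size = 500           # entities written per batched put inside one request
cpu_ms_per_entity = 120    # assumed datastore CPU cost charged per entity write

# Total entities written, and the CPU time billed for writing them.
entities = wall_clock_hours * requests_per_hour * batch_size
billed_cpu_hours = entities * cpu_ms_per_entity / 1000 / 3600

print(billed_cpu_hours)  # 40.0 -- far more than the 2 wall-clock hours
```

So even a strictly serial driver script can rack up tens of CPU hours if each request fans out into batched, parallel datastore work.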
On Wed, Mar 2, 2011 at 5:58 PM, Eloff <[email protected]> wrote:
> Hi,
>
> I have a little driver script for uploading data. It makes a request
> to the GAE app, waits for the response, checks it, then does it again.
> So it's completely serial, and each GAE request finishes completely; no
> tasks are started. So how then, in less than 2 hours clock-time, did I
> burn through 41 CPU hours? How is that possible? Just what exactly
> is this CPU hour, and how is it calculated?
>
> Baffled,
> Dan
>
> --
> You received this message because you are subscribed to the Google Groups
> "Google App Engine" group.
> To post to this group, send email to [email protected].
> To unsubscribe from this group, send email to
> [email protected].
> For more options, visit this group at
> http://groups.google.com/group/google-appengine?hl=en.

--
*Jeff Schwartz*
http://jefftschwartz.appspot.com/
http://www.linkedin.com/in/jefftschwartz
follow me on twitter: @jefftschwartz
