I've uploaded about 1.4 million objects, each consisting of a Long and a String (12 characters on average), today over the span of 6-7 hours. I could not use the bulk uploader, since I need my Java objects to use real Long IDs rather than Strings as IDs. The upload process batches up 500 data items and makes one POST request, sending the data gzipped. The servlet unzips the data, builds up a collection (FWIW, I'm using an ArrayList), and then submits all 500 objects in one fell swoop.
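
For concreteness, the servlet side looks roughly like the sketch below. This assumes the low-level datastore API; the kind name "Item", the property name "value", and the tab-separated wire format are just illustrative stand-ins, not my exact code.

    import java.io.BufferedReader;
    import java.io.IOException;
    import java.io.InputStreamReader;
    import java.util.ArrayList;
    import java.util.List;
    import java.util.zip.GZIPInputStream;
    import javax.servlet.http.HttpServlet;
    import javax.servlet.http.HttpServletRequest;
    import javax.servlet.http.HttpServletResponse;

    import com.google.appengine.api.datastore.DatastoreService;
    import com.google.appengine.api.datastore.DatastoreServiceFactory;
    import com.google.appengine.api.datastore.Entity;
    import com.google.appengine.api.datastore.KeyFactory;

    public class BulkUploadServlet extends HttpServlet {
      @Override
      protected void doPost(HttpServletRequest req, HttpServletResponse resp)
          throws IOException {
        DatastoreService ds = DatastoreServiceFactory.getDatastoreService();
        List<Entity> batch = new ArrayList<Entity>(500);

        // Unzip the POST body; each line is assumed to be "<id>\t<value>".
        BufferedReader in = new BufferedReader(
            new InputStreamReader(new GZIPInputStream(req.getInputStream()), "UTF-8"));
        String line;
        while ((line = in.readLine()) != null) {
          String[] parts = line.split("\t", 2);
          long id = Long.parseLong(parts[0]);

          // Use the real Long id as the entity key, not a String or generated key.
          Entity e = new Entity(KeyFactory.createKey("Item", id));
          e.setUnindexedProperty("value", parts[1]);  // no index needed on this field
          batch.add(e);
        }

        // Single batch put for all ~500 entities in the request.
        ds.put(batch);
        resp.setStatus(HttpServletResponse.SC_OK);
      }
    }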
So far, I think that this process is normal; if not, please let me know. My observations are as follows:

1. Inserting 500 objects takes 30+ seconds (per the logs in the datastore).
2. Nearly 3,000 page loads used 24.97 CPU hours (24.76 CPU hours of that in the datastore).

This is not a lot of data: after everything has loaded into the datastore, it's on the order of 20 megabytes. (Note that no indexes exist for this data.)

Are these observations relatively common, or is this only because of the recent datastore issues? If this is not common, does anyone know what I should expect in terms of CPU usage for something as simple as saving this small amount of data?

Nate

P.S. Is there a convenient way to query for something that matches one condition or another? (Like query * where __key__ = KEY(..., 1) OR __key__ = KEY(..., 2).)
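
To illustrate what I mean: for the specific case of keys, I assume a batch get along the lines of the sketch below would do it (kind name "Item" and property "value" are made up), but I'm asking whether there's a general way to express "condition A OR condition B" in a single query.

    import java.util.Arrays;
    import java.util.Map;

    import com.google.appengine.api.datastore.DatastoreService;
    import com.google.appengine.api.datastore.DatastoreServiceFactory;
    import com.google.appengine.api.datastore.Entity;
    import com.google.appengine.api.datastore.Key;
    import com.google.appengine.api.datastore.KeyFactory;

    public class KeyOrFetchSketch {
      // Fetch the entities whose key id is 1 OR 2 in a single batch call.
      static void fetchByKeys() {
        DatastoreService ds = DatastoreServiceFactory.getDatastoreService();
        Key k1 = KeyFactory.createKey("Item", 1L);
        Key k2 = KeyFactory.createKey("Item", 2L);
        Map<Key, Entity> results = ds.get(Arrays.asList(k1, k2));

        for (Entity e : results.values()) {
          System.out.println(e.getKey() + " -> " + e.getProperty("value"));
        }
      }
    }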
