Hmm, can you be more specific about how your app is "dying"
after 92 writes?

here are the limits I am aware of (they all throw exceptions):

1) maximum # of objects per batch write: 500
   (JDO makePersistentAll(...))

2) Datastore operation timeout (5 seconds)

3) HTTP request timeout (30 seconds)

In my application I batched my writes (using JDO makePersistentAll())
at 500 objects per call; those were timing out (5 sec DS timeouts),
so I reduced the size of each batch to avoid this ... then I started
to hit the 3rd timeout (30 second HTTP request).
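To make the tuning concrete, here is a minimal sketch of the batching I described: split the full list into sub-batches and call makePersistentAll() once per sub-batch. The chunk size of 200 and the BatchWriter/partition names are my own illustration, not an official API; pick a size that keeps each call under the 5 sec datastore timeout for your entity complexity.

```java
import java.util.ArrayList;
import java.util.List;

/** Sketch: persist a large list in sub-batches so each datastore call
 *  stays under the 5 second operation timeout. Chunk size is a guess
 *  (200 here) -- tune it for the complexity of your objects. */
public class BatchWriter {

    // Split a list into sub-lists of at most chunkSize elements.
    static <T> List<List<T>> partition(List<T> items, int chunkSize) {
        List<List<T>> chunks = new ArrayList<List<T>>();
        for (int i = 0; i < items.size(); i += chunkSize) {
            chunks.add(items.subList(i, Math.min(i + chunkSize, items.size())));
        }
        return chunks;
    }

    // Inside a servlet you would then do something like (pm is your
    // JDO PersistenceManager; this part is commented out because it
    // needs the App Engine runtime):
    //
    //   for (List<MyEntity> chunk : partition(entities, 200)) {
    //       pm.makePersistentAll(chunk);   // each call stays small
    //   }
}
```

Note the whole request still has to fit in the 30 second HTTP deadline, so smaller batches only move the problem if the total entity count is large enough.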

The "best" solution to this is to have a mechanism which does
background batch uploads (I believe one is in the works).

The solution for Python is to use a client-side RDBMS to
"transactionally" fragment and upload the content into the
DS ...

Another solution would be to upload the "file" into memcache (assuming
that operation does not exceed the 30 sec HTTP limit)
and then "background" process the file from the cache into the DS;
there is another thread discussing this feature elsewhere ...
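If you go that route, remember a single memcache value is capped at roughly 1 MB, so a large upload has to be sliced first. Here is a rough sketch of the slicing step; the UploadSlicer name and the "upload:<n>" key scheme are purely illustrative, and the actual memcache calls are left as comments since they need the App Engine runtime.

```java
import java.util.ArrayList;
import java.util.List;

/** Sketch: slice an uploaded payload into pieces small enough for a
 *  single memcache value (App Engine caps each value at ~1 MB), so a
 *  later "background" request can pull pieces out and persist them. */
public class UploadSlicer {

    static final int MAX_VALUE_BYTES = 1000 * 1000; // stay under the ~1 MB cap

    static List<byte[]> slice(byte[] payload, int maxBytes) {
        List<byte[]> pieces = new ArrayList<byte[]>();
        for (int off = 0; off < payload.length; off += maxBytes) {
            int len = Math.min(maxBytes, payload.length - off);
            byte[] piece = new byte[len];
            System.arraycopy(payload, off, piece, 0, len);
            pieces.add(piece);
        }
        return pieces;
    }

    // With the low-level memcache API you would then store each piece:
    //
    //   MemcacheService cache = MemcacheServiceFactory.getMemcacheService();
    //   for (int i = 0; i < pieces.size(); i++)
    //       cache.put("upload:" + i, pieces.get(i));
    //
    // and a follow-up request would read them back and persist to the DS.
}
```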


On Sep 14, 12:23 pm, hg <[email protected]> wrote:
> Thank you for replying.
> I tried batching the writes, and it didn't solve the problem. However,
> it did show me that my script is dying after I create 92 (exactly)
> entities, whether or not they have all been persisted. Could this be
> connected to the 2k per request? What exactly is the 2k per request?
> Thanks!
>
> On Sep 14, 7:01 pm, Larry Cable <[email protected]> wrote:
>
>
>
> > read this thread ...
>
> > you might want to batch the writes you should get around 200-300
> > objects per write (under the 5 sec timeout)
> > and probably around 2k per request before hitting the 30sec request
> > timeout (all depends on the complexity of
> > your objects)
>
> > On Sep 14, 5:54 am, Iain <[email protected]> wrote:
>
> > > What do you mean by times out? Do you mean you hit the 30 second
> > > deadline?
>
> > > On Sep 14, 3:59 am, hg <[email protected]> wrote:
>
> > > > Hi,
> > > > I am writing a script that is supposed to run quite a few inserts to
> > > > the datastore - up to a couple of hundred. My script keeps timing out
> > > > at 92. After I ran it a few times this evening, it wiped out my entire
> > > > datastore! I did some research and came across the following error
> > > > documentation: for the DatastoreTimeoutException
> > > > DatastoreTimeoutException is thrown when a datastore operation times
> > > > out. This can happen when you attempt to put, get, or delete too many
> > > > entities or an entity with too many properties, or if the datastore is
> > > > overloaded or having trouble.
> > > > I was not getting this error, but it does seem to have some bearing on
> > > > this case. What are the limit on entity 'putting' for a script?
>
> > > > Any suggestions would be appreciated.
> > > > Thanks!
--~--~---------~--~----~------------~-------~--~----~
You received this message because you are subscribed to the Google Groups 
"Google App Engine for Java" group.
To post to this group, send email to [email protected]
To unsubscribe from this group, send email to 
[email protected]
For more options, visit this group at 
http://groups.google.com/group/google-appengine-java?hl=en
-~----------~----~----~----~------~----~------~--~---
