I have a new record that must eventually get into the database. The plan was to have the online handler allocate an id, pass that back to the client, and send the POST data to the task queue to perform the put(). There are various ways immediate put() failures could arise, but we can handle those with a little elegance. These are not high-volume writes.
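The intended flow can be sketched roughly as follows. This is a minimal illustration, not App Engine API code: route_write and TASK_PAYLOAD_LIMIT are hypothetical names, and 10K stands in for the task-payload limit discussed here rather than an exact documented constant.

```python
import json

# Approximate task-size limit that blocks the queue-based approach below.
TASK_PAYLOAD_LIMIT = 10 * 1024

def route_write(post_data):
    """Decide which write path a record can take.

    Returns 'queue' if the serialized POST data fits in a task payload
    (so the task queue can retry the put() until it succeeds), or
    'inline' if the online handler has to do the put() itself.
    """
    payload = json.dumps(post_data).encode("utf-8")
    if len(payload) <= TASK_PAYLOAD_LIMIT:
        return "queue"   # enqueue the put(); retries come for free
    return "inline"      # too large for a task: put() in the handler
```

With no payload limit, every record would take the "queue" path and inherit the task queue's retry-until-success semantics; the limit is what forces the "inline" path and its failure corner cases.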
The limit on task queue task size, however, has thwarted this approach. The only option is to do the put() in the online handler. I have gone with a high-replication app hoping to minimize the risk, but right now I can see no resolution for the corner cases where the online handler's put() fails. Handling these is going to be much less elegant. The best option, I believe, will be to use AWS for these situations.

If I am missing something about how to ensure a put() takes place, please advise. If not, then I would suggest that GAE engineers consider this issue. It is not common sense to have this limited mix of options for high-value, low-volume puts. Forcing your customers to use a competitor's product is not likely a route that maximizes your product's success.

My solution would be simple: a task queue without the 10K limit. Constrain some other element -- perhaps the number of items that can be queued. We should be able to have one queue to which we can pass all the POST data received, and leverage the task queue's ability to retry until successful completion.

Thanks,
stevep

--
You received this message because you are subscribed to the Google Groups "Google App Engine" group.
To post to this group, send email to [email protected].
To unsubscribe from this group, send email to [email protected].
For more options, visit this group at http://groups.google.com/group/google-appengine?hl=en.
