I've created a backup/restore capability for our customer data using Java on the client side and remote_api on the server.
By default the SDK fetches in batches of 20. When I put data back to the server, I am also doing batches of 20. I worry about hitting per-request limits on data size and CPU time doing it this way. Is there a way of estimating the "best" batch size at run time?

-Fred
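P.S. In case it helps to see the shape of the restore side, here is a minimal sketch of the kind of batched put I mean, assuming the Java remote API client (RemoteApiInstaller / RemoteApiOptions) and the low-level DatastoreService. The host, credentials, and readEntitiesFromBackupFile are placeholders, not our actual code:

import java.util.ArrayList;
import java.util.List;

import com.google.appengine.api.datastore.DatastoreService;
import com.google.appengine.api.datastore.DatastoreServiceFactory;
import com.google.appengine.api.datastore.Entity;
import com.google.appengine.tools.remoteapi.RemoteApiInstaller;
import com.google.appengine.tools.remoteapi.RemoteApiOptions;

public class BatchedRestore {

    // Fixed for now; the question is how to pick this at run time.
    private static final int BATCH_SIZE = 20;

    public static void main(String[] args) throws Exception {
        // Route this JVM's datastore calls to the live app via remote_api.
        RemoteApiOptions options = new RemoteApiOptions()
                .server("myapp.appspot.com", 443)            // placeholder host
                .credentials("[email protected]", "password");  // placeholder credentials
        RemoteApiInstaller installer = new RemoteApiInstaller();
        installer.install(options);
        try {
            DatastoreService ds = DatastoreServiceFactory.getDatastoreService();
            List<Entity> batch = new ArrayList<Entity>(BATCH_SIZE);
            for (Entity e : readEntitiesFromBackupFile()) {  // placeholder reader
                batch.add(e);
                if (batch.size() == BATCH_SIZE) {
                    ds.put(batch);  // one remote_api round trip per batch
                    batch.clear();
                }
            }
            if (!batch.isEmpty()) {
                ds.put(batch);      // flush the final partial batch
            }
        } finally {
            installer.uninstall();
        }
    }

    // Placeholder for streaming entities back out of the backup file.
    private static Iterable<Entity> readEntitiesFromBackupFile() {
        return new ArrayList<Entity>();
    }
}

Right now BATCH_SIZE is just a constant; what I'd like is to size each batch so each request stays comfortably under the per-request limits, which is why I'm asking whether that can be estimated at run time.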
