Hi Larry,

> I am wondering about writing a Servlet that would form/multi-part
> upload large files and cache them in memcache, then use the
> cron API to "trickle" persist them into the DS over time ...
I've been thinking about using something like this as well. I think you could likely cache the upload to the store, because the limit here seems to be mainly the number of entities, not the size of any one entity (below 1 MB). I have, for example, 100-200 KB worth of data that I upload, but because it's represented as a couple hundred entities, the insert chokes. I could just upload the 93k and fire off a task (or cron job) that would parse and insert the data offline.

At the very least, I plan to use the low-level API more. The (very useful) performance-testing app http://gaejava.appspot.com/ shows consistently higher CPU usage from JDO. If that ever improves, the app should show it. Until then, the low-level API looks good.

Regards,
Richard
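To make the "upload now, persist later" idea concrete, here is a minimal sketch of the chunking step: splitting an uploaded payload into pieces that stay safely under the datastore's ~1 MB entity limit, so a task or cron job can later write them one entity at a time. This is plain Java with no App Engine SDK dependency; the class and method names (`ChunkedUpload`, `splitIntoChunks`) are hypothetical, and the memcache/datastore calls are only indicated in comments, not implemented.

```java
import java.util.ArrayList;
import java.util.List;

public class ChunkedUpload {

    // Stay under the ~1 MB entity limit, leaving headroom for
    // entity overhead (key, property names, metadata).
    static final int CHUNK_SIZE = 1000 * 1000 - 1024;

    // Split the uploaded payload into sub-1MB chunks. In the servlet,
    // each chunk would be cached (e.g. memcache.put("upload:" + i, chunk))
    // and a task enqueued to read the chunks back later and persist each
    // one as a Blob property on its own entity.
    static List<byte[]> splitIntoChunks(byte[] payload) {
        List<byte[]> chunks = new ArrayList<>();
        for (int off = 0; off < payload.length; off += CHUNK_SIZE) {
            int len = Math.min(CHUNK_SIZE, payload.length - off);
            byte[] chunk = new byte[len];
            System.arraycopy(payload, off, chunk, 0, len);
            chunks.add(chunk);
        }
        return chunks;
    }

    public static void main(String[] args) {
        byte[] data = new byte[2_500_000]; // pretend this is a 2.5 MB upload
        List<byte[]> chunks = splitIntoChunks(data);
        System.out.println(chunks.size() + " chunks");
    }
}
```

The offline task can then insert each chunk in its own datastore write, spreading the cost over time instead of hitting the per-request limits all at once.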