Option 6) would be to use AJAX to get separate handlers to build each
chunk of the file and save it in the datastore and/or memcache. When
all the parts have been built, have a handler that fetches each chunk
and simply outputs it. A single handler can send 10 MB in one request.
That way you only do the concatenation/append of the chunks at the
last minute.
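A rough sketch of that pattern, with a plain dict standing in for the datastore/memcache and all function and key names purely illustrative (not App Engine APIs):

```python
# Option 6 sketch: each AJAX-triggered handler renders one batch of
# records as CSV and stores it under a numbered key; a final handler
# fetches every chunk in order and concatenates them at the last minute.
import csv
import io

chunk_store = {}  # stand-in for memcache or a datastore entity per chunk


def build_chunk(job_id, chunk_index, records):
    """One handler invocation: render a batch of records and store it."""
    buf = io.StringIO()
    csv.writer(buf).writerows(records)
    chunk_store[(job_id, chunk_index)] = buf.getvalue()


def assemble(job_id, num_chunks):
    """Final handler: join the stored chunks and output the whole file."""
    return "".join(chunk_store[(job_id, i)] for i in range(num_chunks))
```

Each `build_chunk` call fits comfortably in one request, and only `assemble` touches the full file, so no single request ever holds more than it has to until the final output.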






On 03/04/2009, Andrew Fong <[email protected]> wrote:
>
>  Here's my scenario: every once in a while, I want to let certain users
>  dump a large number of records from the datastore (e.g. 1000+) into a
>  CSV file.
>
>  I've already figured out how to get around the 1000-record limit by
>  using a time-based auto-increment field. I query the database multiple
>  times, increasing the offset on that field until I've queried all the
>  records. There's a good chance I'll have more data than can fit within
>  the constraints of one GAE request-response cycle, so I'm using AJAX
>  to spread the calls out over several requests.
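That offset-on-a-field trick can be sketched like this; `fetch_page` is an in-memory stand-in for a datastore query of the form `query.filter('created >', cursor).order('created').fetch(limit)`, and all names here are illustrative:

```python
# Keyset pagination sketch: instead of an OFFSET (capped at 1000),
# filter on a monotonically increasing field and use the last value
# seen as the lower bound for the next batch.
RECORDS = [{"created": i, "name": "row%d" % i} for i in range(2500)]


def fetch_page(after, limit=1000):
    """Stand-in for a datastore query: rows with created > after, ordered."""
    page = sorted(
        (r for r in RECORDS if r["created"] > after),
        key=lambda r: r["created"],
    )
    return page[:limit]


def fetch_all():
    """Walk the whole table 1000 rows at a time."""
    results, cursor = [], -1
    while True:
        page = fetch_page(cursor)
        if not page:
            break
        results.extend(page)
        cursor = page[-1]["created"]  # advance the keyset cursor
    return results
```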
>
>  Here's where I'm stuck. I have a setup where I'm fetching 1000 records
>  at a time and would like to concatenate all these fetches to generate
>  my CSV file. On a traditional setup, I could just save a file to disk
>  and keep appending stuff to it before returning it to the user. On
>  GAE, I can't store anything larger than 1 MB in the datastore. I know
>  the GAE team plans to offer large file storage at some point down the
>  road, but I need this in the near future so I'd like to see what
>  workarounds people are trying right now.
>
>  Some options I'm considering ...
>
>  1) Use AJAX calls to download small chunks, one at a time to the
>  client. Hard part is figuring out how to use client-side scripting to
>  save data to the user's file system without running afoul of browser
>  security restrictions.
>
>  2) Use AJAX calls to download small chunks, concatenate them, and then
>  ask the user to paste the text into a text editor and save it as a
>  CSV. Hard part is doing this in a way that's user friendly.
>
>  3) Force the user to download multiple smaller CSV files. Again, not
>  user friendly.
>
>  4) Save to S3. Hard part is that S3 doesn't allow me to append to a
>  file already in S3, something that would have been nice considering
>  I'm spreading out small 1000-record writes to the same file over
>  multiple requests.
>
>  5) Set up a separate box that GAE hits up when it needs to do an
>  operation like this. Not hard to do but very annoying. Sorta defeats
>  the purpose of App Engine.
>
>  Am I missing any options here? Thoughts?
>
>  -- Andrew
>


-- 
Barry

- www.nearby.org.uk - www.geograph.org.uk -

You received this message because you are subscribed to the Google Groups 
"Google App Engine" group.
