Thanks Stephen - that's just what I was thinking over the weekend. The files may be over 1 MB, so I would need to split them up anyway.
Richard

On Oct 30, 11:41 pm, Stephen <[email protected]> wrote:
> On Oct 28, 1:33 pm, Baron <[email protected]> wrote:
> > hello,
> >
> > part of my app uploads a file and needs to parse each line of the
> > file. The parsing of the whole document takes longer than 30 seconds,
> > so I was wondering how to handle this.
> >
> > Currently I am considering storing the upload in the datastore and
> > parsing as far as I can. Then, when I get a DeadlineExceededError, I
> > create a new task queue task with the line number to continue from.
> >
> > Is there a problem with this? Or a better way?
>
> You could experimentally determine how many lines you can parse in
> 20-25 wall-clock seconds (within the 30-second DeadlineExceededError
> limit).
>
> When the file is uploaded, immediately split it into chunks of the
> above size. Then db.put() all chunks to the datastore in a single call.
>
> Create n tasks to process the chunks, one for each chunk, passing a
> key to each.
>
> This has the advantage that the client which uploaded the file gets a
> response sooner (it does not have to wait for the initial 30-second
> timeout), and the chunks are parsed in parallel (if you configure the
> queue to do so).
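Stephen's approach can be sketched roughly as below. This is a plain-Python stand-in, not App Engine code: the datastore batch `db.put()` and the task-queue `taskqueue.add()` calls are replaced with a dict and a list of task descriptions, and `LINES_PER_CHUNK`, `chunk_store`, and `handle_upload` are hypothetical names chosen for illustration.

```python
# Sketch of the upload handler: split the file into chunks small enough
# to parse within the request deadline, store them all in one batch,
# then create one task per chunk, passing its key. Storage and queueing
# are stubbed with plain Python structures here.

LINES_PER_CHUNK = 500  # determined experimentally (20-25 s of parsing)

chunk_store = {}  # stand-in for the datastore


def split_into_chunks(text, lines_per_chunk=LINES_PER_CHUNK):
    """Split the uploaded text into chunks of at most lines_per_chunk lines."""
    lines = text.splitlines()
    return ["\n".join(lines[i:i + lines_per_chunk])
            for i in range(0, len(lines), lines_per_chunk)]


def store_chunks(chunks):
    """Store all chunks in one batch (stand-in for a single db.put())."""
    keys = []
    for i, chunk in enumerate(chunks):
        key = "chunk-%d" % i
        chunk_store[key] = chunk
        keys.append(key)
    return keys


def enqueue_chunk_tasks(keys):
    """Create one task per chunk, passing its key (stand-in for taskqueue.add())."""
    return [{"url": "/parse_chunk", "params": {"key": k}} for k in keys]


def handle_upload(text):
    chunks = split_into_chunks(text)
    keys = store_chunks(keys if False else chunks)  # store all chunks at once
    return enqueue_chunk_tasks(keys)
```

Each queued task would then fetch its chunk by key and parse only those lines, so no single request comes near the 30-second limit, and the upload request itself returns as soon as the tasks are enqueued.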
