Hi Niall,
There would be little point in doing it locally (aside from testing,
which I would encourage): you'd wind up using the same tools either
way, gaining you nothing.
The process could look something like this:
1) upload the CSV as one big blob to the blobstore.
2) Start a task to process the blob.
2.1) Read a chunk of data from the blob, maybe around 100 entities' worth.
2.2) Parse that data into entities, write those entities in bulk
using db.put.
2.3) Insert a new task to resume at step 2.1 where this one left off.
You'll just need to make sure you handle the case where the chunk you
fetch ends with a partial record. It isn't difficult logic. You will
also want to tune the batch size of each put based on your data
(number of indexed properties, entity size, etc.).
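A minimal sketch of the chunking logic above, assuming newline-delimited
CSV records and a chunk size larger than any single record. The App
Engine pieces (blobstore.BlobReader, parsing rows into your models,
db.put, and taskqueue.add to chain the next task) are only indicated in
comments, since they depend on your models; the hypothetical
split_complete_records helper shows the partial-record handling from
step 2.1.

```python
def split_complete_records(chunk, at_end):
    """Split a raw chunk into complete CSV lines.

    Returns (records, bytes_consumed). Any trailing partial line is
    left unconsumed so the next task re-reads it at its new offset.
    """
    if at_end:
        # Last chunk of the blob: everything present is a full record.
        return [line for line in chunk.splitlines() if line], len(chunk)
    cut = chunk.rfind('\n')
    if cut == -1:
        # No complete record fits; the chunk size is too small.
        raise ValueError('chunk smaller than a single record')
    return [line for line in chunk[:cut].split('\n') if line], cut + 1


def process_blob(data, chunk_size):
    """Simulate the chained tasks over an in-memory blob.

    In a real task handler, each loop iteration would instead be one
    task: read via blobstore.BlobReader(key, position=offset), build
    entities from the records, db.put(entities), then
    taskqueue.add(params={'offset': offset + consumed}) and return.
    """
    offset = 0
    records = []
    while offset < len(data):
        chunk = data[offset:offset + chunk_size]
        at_end = offset + chunk_size >= len(data)
        batch, consumed = split_complete_records(chunk, at_end)
        records.extend(batch)  # stand-in for db.put on parsed entities
        offset += consumed     # resume point for the next task
    return records
```

For example, process_blob("a,1\nb,2\nc,3\nd,4\n", 5) walks the blob in
5-byte chunks, carrying the partial "b" / "c" / "d" fragments forward to
the next pass, and yields the four complete rows.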
Robert
On Thu, Aug 11, 2011 at 09:10, Niall <[email protected]> wrote:
> I'll try that now.
> I assume it's in danger of timing out if I leave the whole thing run. What
> might be a safe amount of data to run through every time?
> I was hoping I might be able to make the datastore locally in a test
> (localhost:8080 or something). Then download that. And upload that to my
> appspot. Might this be possible? Where are the local datastores stored?
> Thanks,
> Niall
>
> --
> You received this message because you are subscribed to the Google Groups
> "Google App Engine" group.
> To view this discussion on the web visit
> https://groups.google.com/d/msg/google-appengine/-/FSSFIhfeS4wJ.
> To post to this group, send email to [email protected].
> To unsubscribe from this group, send email to
> [email protected].
> For more options, visit this group at
> http://groups.google.com/group/google-appengine?hl=en.
>