Perhaps you could try uploading the CSV to the Blobstore, then parsing it out in chunks. I've used this method to upload large amounts of data before; it's simple to set up and reasonably fast. I'm not sure you'll be able to import 200k records under the free quota, though.
Robert

On Wed, Aug 10, 2011 at 14:26, Niall <[email protected]> wrote:
> OK. I was able to get something working. I used the bulk uploader, but
> the problem is that there are over 200k items, and I used 100% of my
> quota uploading less than 10% of them.
> Is there a way of doing this better? Or do I have to wait 10 days to
> upload everything?
> Also, what's more processor intensive: posting to the remote API, or
> parsing?
>
> --
> You received this message because you are subscribed to the Google Groups
> "Google App Engine" group.
> To view this discussion on the web visit
> https://groups.google.com/d/msg/google-appengine/-/yk4A4bftfF4J.
> To post to this group, send email to [email protected].
> To unsubscribe from this group, send email to
> [email protected].
> For more options, visit this group at
> http://groups.google.com/group/google-appengine?hl=en.
