I had a similar situation with a list of variable-length lists of variable-length
dictionaries. I used pickle to pack the data into a single byte string. I then
sliced the pickled data into smaller (<1MB) chunks and stored each chunk in its
own datastore entity, which holds the chunk as a blob along with a sequence
number for reassembly.
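A minimal sketch of the split/reassemble step. The function names, the 900KB
chunk size (chosen to stay safely under the 1MB entity limit after overhead),
and the idea of storing each (sequence number, blob) pair in its own entity
are my choices here, not anything App Engine mandates:

```python
import pickle

# Keep chunks comfortably under the 1MB datastore entity limit,
# leaving headroom for entity metadata and index overhead.
CHUNK_SIZE = 900 * 1024

def to_chunks(obj, chunk_size=CHUNK_SIZE):
    """Pickle obj and slice the bytes into (sequence_number, blob) pairs.

    Each pair would be written as one datastore entity, e.g. a model with
    an IntegerProperty for the sequence number and a BlobProperty for the
    chunk, all sharing a common parent key so they can be fetched together.
    """
    data = pickle.dumps(obj)
    return [(seq, data[off:off + chunk_size])
            for seq, off in enumerate(range(0, len(data), chunk_size))]

def from_chunks(chunks):
    """Sort the fetched chunks by sequence number, join, and unpickle."""
    data = b"".join(blob for _, blob in sorted(chunks))
    return pickle.loads(data)
```

Because all the chunk entities can live under one parent key, the writes can
go in a single transaction, and a query on that ancestor fetches every chunk
back for reassembly.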


On 15 January 2013 14:02, wwwtyro <[email protected]> wrote:

> It looks like I have a couple of options:
>
> 1. use the datastore, but break the data into smaller pieces
> 2. use google cloud sql (Are there per-row limits?)
>
> And a couple that look less likely to be workable:
>
> 3. Somehow use the blobstore within a transaction?
> 4. Somehow use google cloud storage within a transaction?
>
> The data I am storing is json. I've been using a JsonProperty in the
> datastore, until I realized I was not going to be able to go over the
> limit. If I did use option 1, how would I break up a json string into
> roughly 1MB blocks for efficient storage? How do I do it efficiently when
> using some kind of compression (and how would I use compression)?
>
> I'm also not attached to any of these solutions. Whatever best (and most
> simply) solves the problem in the subject is what I want to hear. :) Thanks
> for any help!
>
> --
> You received this message because you are subscribed to the Google Groups
> "Google App Engine" group.
> To view this discussion on the web visit
> https://groups.google.com/d/msg/google-appengine/-/LF9gzpusTO8J.
> To post to this group, send email to [email protected].
> To unsubscribe from this group, send email to
> [email protected].
> For more options, visit this group at
> http://groups.google.com/group/google-appengine?hl=en.
>
