I've been trying to save some data in Datastore following the Google 
Bookshelf Python example, with the gcloud Python library, as explained in 
the tutorial. 

Saving small strings and timestamps works fine. 

The problem comes with other formats. For example, if I want to save a 
large JSON string: 

    data['result'] = json.dumps(tags[0])
    ds = datastore.Client(current_app.config['PROJECT_ID'])
    if id:
        key = ds.key('Pred', id)
    else:
        key = ds.key('Pred')

    entity = datastore.Entity(
        key=key,
        exclude_from_indexes=['description'])

    entity.update(data)
    ds.put(entity)
    return from_datastore(entity)

It fails, saying the string is too long (more than 1500 bytes) to be 
accepted. 
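My guess (an assumption, not something I've confirmed) is that the 1500-byte limit only applies to *indexed* string properties, and that I excluded the wrong property from indexing ('description' instead of 'result', which actually holds the JSON). To illustrate, here is a hypothetical helper that picks out which properties are too long to index:

```python
import json

# Datastore's documented limit for indexed string properties (bytes).
MAX_INDEXED_BYTES = 1500

def properties_to_exclude(data):
    """Return the names of string properties too long to be indexed.

    Hypothetical helper for building the `exclude_from_indexes` list.
    """
    return [name for name, value in data.items()
            if isinstance(value, str)
            and len(value.encode('utf-8')) > MAX_INDEXED_BYTES]

data = {'result': json.dumps({'tags': ['x' * 2000]}), 'label': 'small'}
print(properties_to_exclude(data))  # → ['result']
```

If my reading of the limit is right, passing `exclude_from_indexes=properties_to_exclude(data)` (or simply `['result']`) to `datastore.Entity` should avoid the error, but I haven't been able to confirm it.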

I've tried other solutions, such as the Google App Engine ndb library, but 
when using that library from a machine outside Compute Engine, it fails 
saying there is no API proxy registered for datastore_v3.

It also seems the example in the tutorial follows db practices, not ndb.

The same question about how to store a text field in Datastore applies to 
other formats as well.

Thanks

-- 
You received this message because you are subscribed to the Google Groups 
"Google App Engine" group.