Storage density is going up, networks are getting faster, datasets are
growing larger, and verbose text-based serialization formats dominate
data interchange.  1 megabyte is starting to look awfully small.  But
that's the GAE limit for both URLFetch requests and user-facing
requests.

I'm about to launch a feature that moves a lot of JSON-encoded polygon
data through App Engine.  The vast majority of my requests will fit
into 1M, but the outliers might get uncomfortably close to the limit.
All it would take is a handful of polygons with large numbers of
points (or very long descriptions) to push a response over.
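
To make that concrete, here's the back-of-the-envelope sketch I've
been doing, in Python.  The numbers below are made up for illustration
(they're not my actual dataset), but the shape is representative:

    import json

    # One hypothetical polygon: a free-text description plus 5,000
    # vertices rounded to 6 decimal places.
    polygon = {
        "description": "a long free-text description " * 10,
        "points": [[round(-122.419416 + i * 1e-6, 6), 37.774929]
                   for i in range(5000)],
    }

    # Eight such polygons in a single response...
    payload = json.dumps([polygon] * 8)
    print(len(payload))  # ...is already around 1 MB

At default JSON formatting each vertex costs roughly 25 bytes, so a
handful of detailed polygons gets to 1M quickly.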

First question:  Does the 1M limit apply before or after gzipping?
I'm guessing the answer is before, which is unfortunate, because JSON
polygon data will probably compress to a small fraction of its
original size.  My clients will happily download a couple hundred KB
of polygon data if App Engine can send it.
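
For reference, here's how I'd measure the compression ratio on my
end.  This uses Python 3's gzip module on synthetic data, not my real
payload, but repetitive coordinate text like this should be
representative:

    import gzip
    import json

    # Synthetic coordinate list standing in for my polygon data.
    points = [[round(-122.419416 + i * 1e-6, 6), 37.774929]
              for i in range(40000)]
    raw = json.dumps(points).encode("utf-8")
    packed = gzip.compress(raw)
    print("raw: %d bytes, gzipped: %d bytes (%.0f%% of original)"
          % (len(raw), len(packed), 100.0 * len(packed) / len(raw)))

On data like this the gzipped size comes out at a small fraction of
the raw size, which is exactly why the before-or-after question
matters to me.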

Second question:  Is it possible to start a discussion about raising
this limit?  Even just doubling it to 2M would provide some breathing
room.

Thanks,
Jeff
