Hi all, I just did some optimization these days; my Datastore Read operations dropped from 70k to less than 40k per day. I achieved this in two ways:
1. Hooking every datastore operation, logging the requests that fetched more than 5 entities, then checking each of them to see if I had forgotten to cache.
2. Increasing the expiration time of some caches.

But I think I still can't stay within the free quota once I get 2x the traffic. I didn't merge entities together because it would make my code hard to understand and maintain; the max entity size also prevented me from doing this.

My concern is that Datastore Read operations seem cheap ($0.7 / million), so why not give us more free quota? I think 500k is reasonable for most small apps. I only use 5% CPU (0.3 CPU hours) per day and I don't care about the SLA, so why does such a small app have to pay $9 per month?

----------
keakon

My blog (Chinese): www.keakon.net
Blog source code: https://bitbucket.org/keakon/doodle/


On Wed, Sep 7, 2011 at 6:07 PM, Gerald Tan <[email protected]> wrote:
> I've thought of caching to the blobstore too, for stuff that will include a
> lot of entities and don't change a lot.
> Has anyone tried this, is it plausible?
>
> --
> You received this message because you are subscribed to the Google Groups
> "Google App Engine" group.
> To view this discussion on the web visit
> https://groups.google.com/d/msg/google-appengine/-/C2DjBaNsCW8J.
>
> To post to this group, send email to [email protected].
> To unsubscribe from this group, send email to
> [email protected].
> For more options, visit this group at
> http://groups.google.com/group/google-appengine?hl=en.
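P.S. In case anyone wants to try step 1, here's a minimal sketch of the idea. It is not my actual hook code (on App Engine you would wrap your db/ndb query helpers or use the apiproxy hooks instead); the threshold constant, the decorator name, and the `fetch_recent_posts` stand-in are all made up for illustration:

```python
import functools
import logging

# Illustrative threshold: log any fetch returning more than 5 entities,
# matching the 5-entity cutoff described above.
FETCH_LOG_THRESHOLD = 5

def log_large_fetches(fetch_func):
    """Wrap a fetch function and log calls that return more than
    FETCH_LOG_THRESHOLD entities, so uncached hot spots show up
    in the request logs."""
    @functools.wraps(fetch_func)
    def wrapper(*args, **kwargs):
        entities = fetch_func(*args, **kwargs)
        if len(entities) > FETCH_LOG_THRESHOLD:
            logging.warning(
                'large fetch: %s returned %d entities (args=%r)',
                fetch_func.__name__, len(entities), args)
        return entities
    return wrapper

# Hypothetical stand-in for a real datastore query helper:
@log_large_fetches
def fetch_recent_posts(limit):
    return ['post-%d' % i for i in range(limit)]  # placeholder data
```

Once the warnings pile up in the logs, each flagged call site is a candidate for memcache.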
