If you want to see how your site responds to bot crawling, go to
http://www.xml-sitemaps.com/ and generate a site map for your project.

When I first did this for http://www.bookdope.com, I almost
immediately pushed my site over quota.  I got tons of "...used a high
amount of CPU, and was roughly X.x times over the average request CPU
limit..." warnings, and then my site would report an over-quota
message.  I lamented, "Oh my gosh, I've wasted my time on Google App
Engine.  It's not working."

So, I went back to the drawing board...

I used memcache liberally all over the place, and I switched from
pyAWS (an Amazon Associates Web Service wrapper) to basic
xml.parsers.expat XML parsing for my two most expensive web service
operations (item lookup and item search).
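To make that concrete, here is a minimal sketch of the two changes combined: a cache-aside wrapper modeled on App Engine's memcache API (stubbed with a plain dict here so it runs anywhere), around a bare xml.parsers.expat parser for an Amazon-style ItemLookup response.  The names (parse_item_titles, lookup_item) and the response shape are illustrative assumptions, not the actual bookdope.com code.

```python
import xml.parsers.expat

# Stand-in for google.appengine.api.memcache so the sketch is
# self-contained; on App Engine you would call memcache.get/set.
_cache = {}

def cache_get(key):
    return _cache.get(key)

def cache_set(key, value, time=3600):
    _cache[key] = value

def parse_item_titles(xml_text):
    """Pull every <Title> out of an ItemLookup-style response
    using expat's streaming handlers (no DOM, low CPU)."""
    titles = []
    state = {'in_title': False, 'buf': []}

    def start(name, attrs):
        if name == 'Title':
            state['in_title'] = True
            state['buf'] = []

    def chars(data):
        if state['in_title']:
            state['buf'].append(data)

    def end(name):
        if name == 'Title':
            titles.append(''.join(state['buf']))
            state['in_title'] = False

    parser = xml.parsers.expat.ParserCreate()
    parser.StartElementHandler = start
    parser.CharacterDataHandler = chars
    parser.EndElementHandler = end
    parser.Parse(xml_text, True)  # True = final chunk
    return titles

def lookup_item(asin, fetch):
    """Cache-aside: only hit the web service (and re-parse)
    on a cache miss."""
    key = 'item:' + asin
    cached = cache_get(key)
    if cached is not None:
        return cached
    titles = parse_item_titles(fetch(asin))
    cache_set(key, titles)
    return titles
```

The point of the pattern is that a crawler hammering the same pages pays the web-service and parsing cost once per cache lifetime instead of once per request.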

I'm pleased to report that my last site map generation (using
http://www.xml-sitemaps.com/) at the maximum of 500 links produced
only 2 CPU warnings!  I even browsed the site myself during the crawl
and benefited from the cached pages.

In conclusion, I'm not lamenting any more.  I'm quite pleased, knowing
that if I keep my per-request CPU usage down, I'm going to scale as
advertised.

Thank You Google App Engine!

Dale
