Obviously we're going to have to do some optimization work before we're production-ready. My general approach to developing software is: (1) make it work; (2) make it work well; (3) make it work fast. We're still only at step (1), but thanks for reporting this.
Nevertheless, while I was in there fixing the GoogleDelete problem, I noticed a small optimization I could make when reading from the datastore. This has been committed and will be available tomorrow (June 10). I'll be curious to know whether it has any effect on your test.

Vince

On Sun, Jun 7, 2009 at 5:46 PM, Baz <[email protected]> wrote:
> I have a simple test site (gae.thinkloop.com) that stores some CGI info
> for every visitor to the site (in the datastore), then queries them all
> for display. Every additional record makes the page load slower and
> slower until finally it craps out at a little over 300 records with this
> error:
>
> com.google.appengine.api.datastore.DatastoreTimeoutException:
> datastore timeout: operation took too long.
>
> Google's limit is 1000 records, so I am wondering if OpenBD adds some
> overhead that won't allow it to reach the max, or whether the 1000-record
> limit is only theoretical and could only be achieved with the most basic
> recordset if no one is using GAE at the moment.
>
> Anyway, just reporting.
>
> Baz
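For anyone reproducing this, here is a minimal sketch of the fetch-all pattern the test page boils down to, written against the low-level GAE Java datastore API directly (the "Visitor" entity kind and the 300-row cap are illustrative, not taken from Baz's actual site):

    import com.google.appengine.api.datastore.DatastoreService;
    import com.google.appengine.api.datastore.DatastoreServiceFactory;
    import com.google.appengine.api.datastore.Entity;
    import com.google.appengine.api.datastore.FetchOptions;
    import com.google.appengine.api.datastore.PreparedQuery;
    import com.google.appengine.api.datastore.Query;

    public class VisitorLog {
        private final DatastoreService datastore =
                DatastoreServiceFactory.getDatastoreService();

        // Fetch-all pattern similar to what the test page does: every stored
        // visitor row is pulled back on each request, so the cost of the read
        // grows with the table and can eventually exceed the datastore's
        // per-operation deadline (DatastoreTimeoutException).
        public Iterable<Entity> fetchAllVisitors() {
            Query q = new Query("Visitor"); // hypothetical entity kind
            PreparedQuery pq = datastore.prepare(q);
            // Capping the fetch (and paging the remainder with an offset on
            // later requests) keeps each round trip under the deadline.
            return pq.asIterable(FetchOptions.Builder.withLimit(300));
        }
    }

Capping or paging the fetch is the usual way to stay under the datastore deadline at this scale; whether OpenBD adds overhead on top of the raw API is exactly the open question here.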
