My current apps use tiny amounts of memory, so that would be good for me too. :D However, to be fair, apps that consume huge amounts of CPU should have some form of quota too.
The problems with the idea of paying for instances include, but are perhaps not limited to:

1. No incentive for Google to make their scheduler better: the more instances the scheduler creates, the more money Google makes and the more the customers have to pay.

2. Unless carefully defined, an 'instance' is an arbitrary concept: how many instances are crammed into each server, how much effective CPU time and other resources each instance gets in reality, and how much overhead is included are all things Google can redefine at their own whim. That can be opaque as hell. And here again, if the customers pay for all the overhead caused by the GAE infrastructure software, then there is no incentive for Google to improve the performance of that code.

3. It is unclear and difficult for customers to estimate how many instances they will have to pay for each month, even when they know the average load of their application, since the spawning of new instances is determined by Google's code, not the application code.

--
You received this message because you are subscribed to the Google Groups "Google App Engine" group.
To post to this group, send email to [email protected].
To unsubscribe from this group, send email to [email protected].
For more options, visit this group at http://groups.google.com/group/google-appengine?hl=en.
