Hello, I've been doing some tests to see if GAE is feasible as a backend for an iPhone app I'm working on. So far, it appears that CPU Time is going to drive the cost beyond what Amazon EC2+S3 would charge. If I've made a mistake in my math, I'd appreciate any corrections.
I have an app with a single "producer" and a single "consumer" running on my local machine. The producer uses remote_api to perform datastore puts, and the consumer performs datastore gets. I've run them over the past 17 hours, and the Quota Details page shows 0.2GB of incoming and 0.2GB of outgoing bandwidth. CPU Time has been 0.84 CPU-hours over 6608 requests, which works out to about 458 CPU-milliseconds per request. Google claims that in May, their new free quota of 6.5 CPU-hours per day will support 5 million page views per month, which means they're estimating about 142 CPU-milliseconds per request.

If I stick with my 458 CPU-milliseconds per request and assume my iPhone app is popular enough to actually get 5 million requests per month, I'll consume about 21 CPU-hours per day. That's close enough to 24 CPU-hours per day that I decided to compare GAE against a dedicated Amazon EC2 instance. I'm going to give Google the benefit of the doubt and assume I don't exceed the free limits of 1GB of data in and 1GB of data out per day, and that my storage never exceeds the free limit of 1GB.

With GAE, I get 6.5 CPU-hours per day for free and have to pay for the remaining 17.5 hours per day at $0.10/CPU-hour. Over a year, that amounts to $638.75. Since data in, data out, and storage are all within the free limits, my total annual cost with Google is $638.75.

With Amazon EC2, I can pay $500 for a 3-year reservation on an instance, which is a fixed cost of $166.67/year. On top of that, I pay for 24 hours/day of instance time at $0.03/hour, which comes to $262.80/year.
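For anyone who wants to check my arithmetic, here it is as a few lines of Python (the constants are just the figures quoted above, and I'm using 30.4 days/month ≈ 365/12; this is a back-of-the-envelope sketch, not billing code):

```python
# Measured on the Quota Details page over ~17 hours of producer/consumer traffic.
CPU_HOURS_MEASURED = 0.84
REQUESTS_MEASURED = 6608
DAYS_PER_MONTH = 30.4  # ~365/12

# My observed cost per request, in CPU-milliseconds.
ms_per_request = CPU_HOURS_MEASURED * 3600 * 1000 / REQUESTS_MEASURED  # ~458

# Google's implied estimate: 6.5 free CPU-hours/day covering 5M page views/month.
google_ms_per_request = 6.5 * 3600 * 1000 / (5e6 / DAYS_PER_MONTH)  # ~142

# At my measured rate, 5M requests/month works out to roughly 21 CPU-hours/day.
daily_cpu_hours = (5e6 / DAYS_PER_MONTH) * (ms_per_request / 1000) / 3600

# GAE: round up to a full 24 CPU-hours/day; everything past the free 6.5 is billable.
gae_annual = (24 - 6.5) * 0.10 * 365  # $638.75

# EC2: $500 3-year reservation, plus $0.03/hour running 24x7.
ec2_annual = 500 / 3.0 + 24 * 365 * 0.03  # $166.67 + $262.80 = $429.47
```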
Assuming I actually consume Google's free bandwidth limits of 1GB of data in and 1GB out per day, Amazon would charge:

1GB in/day = 365 GB in/year * $0.10/GB in = $36.50/year for bandwidth in
1GB out/day = 365 GB out/year * $0.14/GB out = $51.10/year for bandwidth out

In addition, assuming I store Google's free storage limit of 1GB for the whole year: S3 bills per GB-month, so a constant 1GB held for 12 months costs:

1GB * 12 GB-months * $0.15/GB-month = $1.80/year for storage

Amazon total annual cost: $166.67 + $262.80 + $36.50 + $51.10 + $1.80 = $518.87

So GAE would cost $638.75 per year where Amazon would cost $518.87. Despite all the freebies Google's throwing in, Amazon is still cheaper. In addition, I can't send iPhone push notifications from GAE, but I can from EC2.

I really like the GAE concept, but I just can't seem to make the math work out. I realize there are a lot of assumptions in here, and 5 million page hits per month may be a gross assumption, but in my testing it doesn't take much to send CPU Time through the roof. My "producer" was originally a cron task that did a urlfetch and performed several datastore puts. I was getting timeouts and CPU times of up to 30s per request. I converted it to use remote_api and did some optimization on the code, and now I have it down to something more reasonable (occasionally peaking at 2s per request). But my "consumer" task is just a simple get that performs a query on the datastore returning between 10 and 20 entities, and the log confirms it's taking around 400 CPU-milliseconds per request. What am I missing?
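Putting the Amazon side together in the same back-of-the-envelope style (note the storage term: S3 charges per GB-month, so a flat 1GB held all year is 12 GB-months):

```python
# Annual cost at Amazon, assuming I exactly consume GAE's free quotas.
ec2_reservation = 500 / 3.0       # $500 3-year reserved instance, per year
ec2_hourly = 24 * 365 * 0.03      # $0.03/hour, running 24x7

bw_in = 365 * 0.10                # 1 GB/day in at $0.10/GB
bw_out = 365 * 0.14               # 1 GB/day out at $0.14/GB

# S3 storage: a constant 1 GB held for 12 months = 12 GB-months.
storage = 12 * 0.15               # $1.80/year

amazon_total = ec2_reservation + ec2_hourly + bw_in + bw_out + storage
# ~$518.87/year, vs. $638.75/year on GAE
```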
