We have applications that do large batches of work: mapper-driven,
fantasm-driven, and other custom fan-out jobs. The ability to scale
out massively and handle these large jobs in a single spike of work is
a very powerful feature of App Engine. No other platform provides this
capability, and it is an important part of how our applications
operate.

I really hope the scheduler can be made very aggressive. Maybe there
is a way to identify one of these jobs (a custom HTTP header, say) and
allow the scheduler to spin these instances up and down very
aggressively? The nature of these jobs is such that state is not
important; we just want many instances each doing a very small chunk
of work in parallel. I realize that the startup requests for these
instances need to be slimmed down to allow this rapid ramp-up.
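As a rough sketch of the header idea: the header name and helpers below are hypothetical, not an existing App Engine feature; they only illustrate how burst work could be made distinguishable from normal traffic.

```python
# Hypothetical marker for burst-style fan-out work. App Engine does not
# define such a header today -- this is only a sketch of the proposal.
BURST_HEADER = "X-AppEngine-Burst-Job"

def tag_burst_task(headers):
    """Return a copy of a task's HTTP headers with the burst marker added,
    to be set on each fan-out task before it is enqueued."""
    tagged = dict(headers)
    tagged[BURST_HEADER] = "1"
    return tagged

def is_burst_job(headers):
    """True if a request carries the burst marker, so the scheduler (or
    our own handler) can treat the serving instance as disposable."""
    return headers.get(BURST_HEADER) == "1"

headers = tag_burst_task({"Content-Type": "application/octet-stream"})
print(is_burst_job(headers))  # → True
```

With the real taskqueue API the tagged headers would, if I remember the API correctly, be passed via the `headers` argument when constructing a task; the only point here is that burst work is identifiable per-request.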

Does anyone have any thoughts as to how we might approach this under
the new billing model?
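For reference, the idle-time figures quoted below do work out, assuming the announced on-demand frontend rate of $0.08 per instance-hour:

```python
# Check the idle-time cost figures from the quoted post, assuming the
# announced rate of $0.08 per frontend instance-hour.
RATE_PER_HOUR = 0.08
IDLE_MINUTES = 15
instances = 8

# Cost of the 15-minute idle tail across all instances after one run.
idle_cost_per_run = instances * (IDLE_MINUTES / 60.0) * RATE_PER_HOUR
# An hourly job over a 30-day month pays that tail 24 * 30 times.
monthly_cost = idle_cost_per_run * 24 * 30

print(round(idle_cost_per_run, 2))  # 0.16  (dollars per run, idle only)
print(round(monthly_cost, 2))       # 115.2 (dollars per month, idle only)
```

So the $0.16 per run and $115.20 per month are purely the idle tail, before any actual compute time is billed.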

j

On May 20, 4:49 pm, nickmilon <[email protected]> wrote:
> Very interesting question:
> "Is MapReduce still a flexible solution on App Engine under the new
> pricing model?"
> My answer: probably not. The new pricing model makes MapReduce
> operations a no-no. The price will be prohibitive for such
> operations, especially ones that depend on many instances to run a
> job fast, unless those jobs take hours rather than minutes to
> complete.
> So I guess the team can drop the "reduce" part and the query-based
> MapReduce work from the roadmap; the new model renders those
> irrelevant for most use cases.
> Also, drawing a "danger - high $$$" icon as a precaution next to the
> copy/delete model buttons on the control panel would be a good idea.
>
> Nick
>
> On May 20, 3:56 pm, "Raymond C." <[email protected]> wrote:
>
> > As I understand it, MapReduce relies on a relatively large number
> > of instances (on top of the normal traffic) to perform the
> > calculation efficiently in parallel.  Under the new pricing model
> > each instance bills for 15 min of idle time after the job is done.
> > Therefore 15 min times n instances is wasted (costs you without
> > your using them).  If n=8 (for a relatively small and slow task),
> > there is an additional cost of $0.16 just for one MapReduce
> > operation.  That is very costly if you are doing something like an
> > hourly reporting job: 8 instances will cost you $115.20/month for
> > an hourly MapReduce task, *in addition* to the cost of the actual
> > run time, just for MapReduce tasks.
>
> > My question is: is MapReduce still a flexible mechanism on App
> > Engine?  Or should we rely on an external service for this kind of
> > calculation? (More complex, but possibly more cost effective?)

-- 
You received this message because you are subscribed to the Google Groups 
"Google App Engine" group.
To post to this group, send email to [email protected].
To unsubscribe from this group, send email to 
[email protected].
For more options, visit this group at 
http://groups.google.com/group/google-appengine?hl=en.