+1 (non-binding)

Looking forward to using it as part of Apache Spark releases, instead of a
standalone cluster deployed on top of k8s.


--
Alex

On Wed, Aug 16, 2017 at 11:11 AM, Ismaël Mejía <ieme...@gmail.com> wrote:

> +1 (non-binding)
>
> This is something really great to have. More schedulers and runtime
> environments are a HUGE win for the Spark ecosystem.
> Amazing work, and big kudos to the people who created this and continue
> working on it.
>
> On Wed, Aug 16, 2017 at 2:07 AM, lucas.g...@gmail.com
> <lucas.g...@gmail.com> wrote:
> > From our perspective, we have invested heavily in Kubernetes as our
> > cluster manager of choice.
> >
> > We also make quite heavy use of Spark. We've been experimenting with
> > these builds (2.1 with PySpark enabled) quite heavily. Given that we've
> > already 'paid the price' to operate Kubernetes in AWS, it seems rational
> > to move our jobs over to Spark on k8s. Having this project merged into
> > master will significantly ease keeping our data munging toolchain
> > primarily on Spark.
> >
> >
> > Gary Lucas
> > Data Ops Team Lead
> > Unbounce
> >
> > On 15 August 2017 at 15:52, Andrew Ash <and...@andrewash.com> wrote:
> >>
> >> +1 (non-binding)
> >>
> >> We're moving large amounts of infrastructure from a combination of open
> >> source and homegrown cluster management systems to unify on Kubernetes,
> >> and want to bring Spark workloads along with us.
> >>
> >> On Tue, Aug 15, 2017 at 2:29 PM, liyinan926 <liyinan...@gmail.com>
> wrote:
> >>>
> >>> +1 (non-binding)
> >>>
> >>>
> >>>
> >>> --
> >>> View this message in context:
> >>> http://apache-spark-developers-list.1001551.n3.nabble.com/SPIP-Spark-on-Kubernetes-tp22147p22164.html
> >>> Sent from the Apache Spark Developers List mailing list archive at
> >>> Nabble.com.
> >>>
> >>> ---------------------------------------------------------------------
> >>> To unsubscribe e-mail: dev-unsubscr...@spark.apache.org
> >>>
> >>
> >
>
