Thanks for the response. We have tried it in a dev environment. For production: if Spark 3.0 
is not leveraging the K8s scheduler for dynamic allocation, would a Spark cluster in K8s be 
"static" (a fixed number of executors)? Per https://issues.apache.org/jira/browse/SPARK-24432 
it seems this is still a blocker for production workloads?
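(For reference, Spark 3.0 does ship experimental support for dynamic allocation on K8s via shuffle tracking, which avoids the external shuffle service that SPARK-24432 discusses. A minimal sketch of what that submit could look like; the master URL, image name, and jar path below are placeholders, not values from this thread:)

```shell
# Sketch: Spark 3.0 submit against the K8s scheduler with shuffle-tracking-based
# dynamic allocation (no external shuffle service required).
# <k8s-apiserver-host> and <your-repo> are placeholders -- adjust for your cluster.
spark-submit \
  --master k8s://https://<k8s-apiserver-host>:443 \
  --deploy-mode cluster \
  --name spark-pi \
  --class org.apache.spark.examples.SparkPi \
  --conf spark.kubernetes.container.image=<your-repo>/spark:3.0.0 \
  --conf spark.executor.instances=2 \
  --conf spark.dynamicAllocation.enabled=true \
  --conf spark.dynamicAllocation.shuffleTracking.enabled=true \
  --conf spark.dynamicAllocation.maxExecutors=10 \
  local:///opt/spark/examples/jars/spark-examples_2.12-3.0.0.jar 1000
```

With shuffle tracking enabled, executors holding shuffle data are kept alive rather than relying on a node-local shuffle service, so the executor count can scale between the configured bounds instead of staying static.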

Thanks,
Vaibhav V

-----Original Message-----
From: Sean Owen <sro...@gmail.com> 
Sent: Thursday, July 9, 2020 3:20 PM
To: Varshney, Vaibhav (DI SW CAS MP AFC ARC) <vaibhav.varsh...@siemens.com>
Cc: user@spark.apache.org; Ramani, Sai (DI SW CAS MP AFC ARC) 
<sai.ram...@siemens.com>
Subject: Re: [Spark 3.0 Kubernetes] Does Spark 3.0 support production deployment

I haven't used the K8S scheduler personally, but just based on that comment I 
wouldn't worry too much. It's been around for several versions and AFAIK works 
fine in general. We sometimes aren't so great about removing "experimental" 
labels. That said, I know there are still some things that could be added to it 
and more work going on, and maybe people closer to that work can comment. But 
yeah, you shouldn't be afraid to try it.

On Thu, Jul 9, 2020 at 3:18 PM Varshney, Vaibhav <vaibhav.varsh...@siemens.com> 
wrote:
>
> Hi Spark Experts,
>
>
>
> We are trying to deploy Spark on Kubernetes.
>
> As per doc http://spark.apache.org/docs/latest/running-on-kubernetes.html, it 
> looks like K8s deployment is experimental.
>
> "The Kubernetes scheduler is currently experimental ".
>
>
>
> Does Spark 3.0 not support production deployment using the K8s scheduler?
>
> What’s the plan on full support of K8s scheduler?
>
>
>
> Thanks,
>
> Vaibhav V
