Oh, interesting. Given that PySpark was working in the Spark-on-Kubernetes 2.2 fork, I
assumed it would be part of what got merged.

Is there a roadmap for when that may get merged?

Thanks!



On 2 March 2018 at 21:32, Felix Cheung <felixcheun...@hotmail.com> wrote:

> That’s in the plan. We should be sharing a bit more about the roadmap in
> future releases shortly.
>
> In the meantime, the official documentation describes what is coming:
> https://spark.apache.org/docs/latest/running-on-kubernetes.html#future-work
>
> This support started as a fork of the Apache Spark project, and that fork
> has dynamic executor scaling support, which you can check out here:
> https://apache-spark-on-k8s.github.io/userdocs/running-on-kubernetes.html#dynamic-executor-scaling
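> As a rough sketch of what enabling that looks like on the fork (flag names
> are from that docs page; the API server address, namespace, shuffle-service
> labels, and jar path below are placeholders, and the shuffle-service
> DaemonSet the page describes must already be deployed):
>
> ```shell
> # Submit in cluster mode against the Kubernetes API server, with dynamic
> # allocation enabled and executors locating the external shuffle service
> # pods by namespace and label selector.
> bin/spark-submit \
>   --deploy-mode cluster \
>   --master k8s://https://<k8s-apiserver-host>:<port> \
>   --class org.apache.spark.examples.GroupByTest \
>   --conf spark.dynamicAllocation.enabled=true \
>   --conf spark.shuffle.service.enabled=true \
>   --conf spark.kubernetes.shuffle.namespace=default \
>   --conf spark.kubernetes.shuffle.labels="app=spark-shuffle-service,spark-version=2.2.0" \
>   local:///path/to/examples.jar
> ```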
>
>
> ------------------------------
> *From:* Lalwani, Jayesh <jayesh.lalw...@capitalone.com>
> *Sent:* Friday, March 2, 2018 8:08:55 AM
> *To:* user@spark.apache.org
> *Subject:* Question on Spark-kubernetes integration
>
>
> Does the Resource scheduler support dynamic resource allocation? Are there
> any plans to add in the future?
>
>
