Re: Spark Kubernetes Volumes

2018-04-12 Thread Anirudh Ramanathan
There's a JIRA, SPARK-23529, that deals with mounting hostPath volumes.
I propose we extend that PR/JIRA to encompass all the different volume
types and allow mounting them into the driver/executors.
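
To make this concrete, here is a rough sketch of what the application-facing
configuration could look like. The property names below are placeholders, not
a committed API; the actual keys and the set of supported volume types would
be settled as part of that PR/JIRA.

import org.apache.spark.SparkConf

// Hypothetical property names, mirroring the shape discussed in SPARK-23529:
// mount a hostPath volume named "refdata" into every executor at /refdata.
val conf = new SparkConf()
  .set("spark.kubernetes.executor.volumes.hostPath.refdata.mount.path", "/refdata")
  .set("spark.kubernetes.executor.volumes.hostPath.refdata.mount.readOnly", "true")
  .set("spark.kubernetes.executor.volumes.hostPath.refdata.options.path", "/mnt/refdata")

The same settings could equally be passed as --conf flags to spark-submit.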

On Thu, Apr 12, 2018 at 10:55 AM Yinan Li  wrote:

> Hi Marius,
>
> Spark on Kubernetes does not yet support mounting user-specified volumes
> natively. But mounting volumes is supported in
> https://github.com/GoogleCloudPlatform/spark-on-k8s-operator. Please see
> https://github.com/GoogleCloudPlatform/spark-on-k8s-operator/blob/master/docs/user-guide.md#mounting-volumes.
>
> On Thu, Apr 12, 2018 at 7:50 AM, Marius  wrote:
>
>> Hey,
>>
>> I have a question regarding the Spark on Kubernetes feature. I would like
>> to mount a pre-populated Kubernetes volume into the executor pods of
>> Spark. One of the tools that I invoke using Spark's pipe command requires
>> these files to be available on a POSIX-compatible FS, and they are too large
>> to justify copying them around with addFile. If this is not possible, I
>> would like to know whether the community would be interested in such a feature.
>>
>> Cheers
>>
>> Marius
>>
>>
>

-- 
Anirudh Ramanathan


Re: Spark Kubernetes Volumes

2018-04-12 Thread Yinan Li
Hi Marius,

Spark on Kubernetes does not yet support mounting user-specified volumes
natively. But mounting volumes is supported in
https://github.com/GoogleCloudPlatform/spark-on-k8s-operator. Please see
https://github.com/GoogleCloudPlatform/spark-on-k8s-operator/blob/master/docs/user-guide.md#mounting-volumes.
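
For the use case in the quoted message below, the driver-side code would look
roughly like this once such a pre-populated volume is mounted into the
executor pods. This is only a sketch: the mount point /refdata, the tool path,
and the input and output locations are made up for illustration.

import org.apache.spark.sql.SparkSession

object PipeWithMountedVolume {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("pipe-with-mounted-volume").getOrCreate()
    val sc = spark.sparkContext

    // One record per line; pipe() starts the external command once per partition
    // and feeds it these records on stdin.
    val input = sc.textFile("hdfs:///data/records.txt")

    // The tool reads its large reference files directly from the mounted volume,
    // so nothing needs to be shipped with addFile.
    val output = input.pipe("/opt/tools/mytool --reference /refdata/index")

    output.saveAsTextFile("hdfs:///data/output")
    spark.stop()
  }
}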

On Thu, Apr 12, 2018 at 7:50 AM, Marius  wrote:

> Hey,
>
> I have a question regarding the Spark on Kubernetes feature. I would like
> to mount a pre-populated Kubernetes volume into the executor pods of
> Spark. One of the tools that I invoke using Spark's pipe command requires
> these files to be available on a POSIX-compatible FS, and they are too large
> to justify copying them around with addFile. If this is not possible, I
> would like to know whether the community would be interested in such a feature.
>
> Cheers
>
> Marius
>
>