Glad to hear it. Thanks, all, for sharing your solutions.

On Thu, Mar 10, 2016 at 7:19 PM, Eran Chinthaka Withana <eran.chinth...@gmail.com>
wrote:

> Phew, it worked. All I had to do was add *export
> SPARK_JAVA_OPTS="-Dspark.mesos.executor.docker.image=echinthaka/mesos-spark:0.23.1-1.6.0-2.6"*
> before calling spark-submit. Guillaume, thanks for the pointer.
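>
> For anyone hitting the same issue, here is a minimal sketch of the full
> workaround as I understand it (the master URL, main class, and application
> jar below are placeholders; substitute your own):
>
> # Pass the executor image through SPARK_JAVA_OPTS instead of --conf,
> # since --conf does not reach the Mesos executor (see SPARK-13258 below).
> export SPARK_JAVA_OPTS="-Dspark.mesos.executor.docker.image=echinthaka/mesos-spark:0.23.1-1.6.0-2.6"
>
> # Then submit as usual; executors should now start inside the image above.
> ./bin/spark-submit \
>   --master mesos://<mesos-master-host>:5050 \
>   --class <your.main.Class> \
>   <your-application.jar>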
>
> Timothy, thanks for looking into this. Looking forward to seeing a fix soon.
>
> Thanks,
> Eran Chinthaka Withana
>
> On Thu, Mar 10, 2016 at 10:10 AM, Tim Chen <t...@mesosphere.io> wrote:
>
>> Hi Eran,
>>
>> I need to investigate, but that may well be true: we're using
>> SPARK_JAVA_OPTS to pass all the options rather than --conf.
>>
>> I'll take a look at the bug; in the meantime, please try the workaround
>> and see if it fixes your problem.
>>
>> Tim
>>
>> On Thu, Mar 10, 2016 at 10:08 AM, Eran Chinthaka Withana <
>> eran.chinth...@gmail.com> wrote:
>>
>>> Hi Timothy
>>>
>>>> What version of Spark are you guys running?
>>>>
>>>
>>> I'm using Spark 1.6.0. You can see the Dockerfile I used here:
>>> https://github.com/echinthaka/spark-mesos-docker/blob/master/docker/mesos-spark/Dockerfile
>>>
>>>
>>>
>>>> And also did you set the working dir in your image to be spark home?
>>>>
>>>
>>> Yes, I did. You can see it here: https://goo.gl/8PxtV8
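>>>
>>> A quick way to double-check from outside Mesos, assuming you have Docker
>>> installed locally (the image is the one built from my Dockerfile):
>>>
>>> # Runs `pwd` inside a throwaway container; the output should be Spark home.
>>> docker run --rm echinthaka/mesos-spark:0.23.1-1.6.0-2.6 pwd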
>>>
>>> Could it be because of this:
>>> https://issues.apache.org/jira/browse/SPARK-13258, as Guillaume pointed
>>> out above? As you can see, I'm passing the Docker image URI through
>>> spark-submit (--conf
>>> spark.mesos.executor.docker.image=echinthaka/mesos-spark:0.23.1-1.6.0-2.6).
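>>>
>>> For reference, this is roughly the invocation that is not picking up the
>>> image (master URL, main class, and jar are placeholders):
>>>
>>> ./bin/spark-submit \
>>>   --master mesos://<mesos-master-host>:5050 \
>>>   --conf spark.mesos.executor.docker.image=echinthaka/mesos-spark:0.23.1-1.6.0-2.6 \
>>>   --class <your.main.Class> \
>>>   <your-application.jar>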
>>>
>>> Thanks,
>>> Eran
>>>
>>>
>>>
>>
>
