I think the answer is yes.
The code packaged in pyspark.zip needs a Python interpreter on each executor node to execute.
On Tue, Sep 29, 2015 at 2:08 PM, Ranjana Rajendran <
ranjana.rajend...@gmail.com> wrote:
> Hi,
>
> Does a python spark program (which makes use of pyspark ) submitted in
> cluster mode need python on the executor
Hi,
Does a Python Spark program (which makes use of pyspark) submitted in
cluster mode need Python on the executor nodes? Isn't the Python program
interpreted on the client node from where the job is submitted, with the
executors then running in the JVM of each of the executor nodes?
Thank you,
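To see why the executors do need Python, note that PySpark ships the driver's functions to executors as serialized bytes (it uses cloudpickle; the sketch below uses the stdlib `marshal` module to show the same idea), and only a Python worker process on the executor can rebuild and run them; the JVM alone cannot:

```python
import marshal
import types

# A function defined on the "driver" side.
def double(x):
    return x * 2

# Serialize its code object -- stands in for the bytes PySpark
# ships to executor nodes.
payload = marshal.dumps(double.__code__)

# On the "executor" side, only a Python interpreter can rebuild
# the function from those bytes and call it.
code = marshal.loads(payload)
rebuilt = types.FunctionType(code, globals())
print(rebuilt(21))  # -> 42
```

This is why a compatible Python interpreter must be present on every executor node, not just on the client that submits the job.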
Thank you, Ted.
I have Python 2.6 on all the nodes including the client node. I want to
use Python 2.7 instead. For the PySpark shell, I was able to do this by
downloading Python 2.7.8, installing it under a prefix in my home
directory, and setting PYSPARK_PYTHON to ~/python2.7/bin/python
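The same environment variable works for spark-submit as well as the shell. A minimal sketch, assuming Python 2.7 is installed at the same path under $HOME on every node (paths and the script name are illustrative):

```shell
# Interpreter the executors will launch for Python worker processes.
# This path must exist on every executor node.
export PYSPARK_PYTHON="$HOME/python2.7/bin/python"

# Interpreter used on the driver side (optional; defaults to PYSPARK_PYTHON).
export PYSPARK_DRIVER_PYTHON="$HOME/python2.7/bin/python"

# Then submit as usual (hypothetical job script):
# spark-submit --master yarn --deploy-mode cluster my_job.py

echo "$PYSPARK_PYTHON"
```

Note that PYSPARK_PYTHON is resolved on each executor node, so the interpreter must be installed at that path cluster-wide, or shipped to the nodes some other way.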