You can set the PYSPARK_PYTHON environment variable for that.

I'm not sure about zeppelin.pyspark.python; I don't think it works.
See the comments in https://issues.apache.org/jira/browse/ZEPPELIN-1265

Eventually, I think we can remove zeppelin.pyspark.python and use only
PYSPARK_PYTHON instead, to avoid confusion.
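
For example, in conf/zeppelin-env.sh something along these lines should
work (the conda env path below is only an illustration, point it at your
shared location):

  # make Spark worker nodes use the shared conda environment's python
  export PYSPARK_PYTHON=/shared/conda/envs/myenv/bin/python

Then restart the spark interpreter so it picks up the new value.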


-- 
Ruslan Dautkhanov

On Mon, Mar 20, 2017 at 12:59 PM, William Markito Oliveira <
[email protected]> wrote:

> I'm trying to use zeppelin.pyspark.python as the variable to set the
> Python that Spark worker nodes should use for my job, but it doesn't seem
> to be working.
>
> Am I missing something, or does this variable not do that?
>
> My goal is to change that variable to point to different conda
> environments. These environments are available on all worker nodes, since
> they're in a shared location, so ideally every node would have access to
> the same libraries and dependencies.
>
> Thanks,
>
> ~/William
>
