It is dynamic; you can set the environment variable on the interpreter settings page.


Best Regards,
Jeff Zhang


From: Ruslan Dautkhanov <dautkha...@gmail.com>
Reply-To: "users@zeppelin.apache.org" <users@zeppelin.apache.org>
Date: Tuesday, March 21, 2017 at 3:27 AM
To: users <users@zeppelin.apache.org>
Subject: Re: Should zeppelin.pyspark.python be used on the worker nodes ?
Subject: Re: Should zeppelin.pyspark.python be used on the worker nodes ?

You're right - it will not be dynamic.

You may want to check
https://issues.apache.org/jira/browse/ZEPPELIN-2195
https://github.com/apache/zeppelin/pull/2079
It seems this is fixed in the current snapshot of Zeppelin (committed 3 weeks ago).

--
Ruslan Dautkhanov

On Mon, Mar 20, 2017 at 1:21 PM, William Markito Oliveira 
<william.mark...@gmail.com> wrote:
Thanks for the quick response, Ruslan.

But given that it's an environment variable, I can't quickly change that value 
and point to a different Python environment without restarting the Zeppelin 
process, can I? I mean, is there a way to set the value of PYSPARK_PYTHON from 
the Interpreter configuration screen?

Thanks,


On Mon, Mar 20, 2017 at 2:15 PM, Ruslan Dautkhanov 
<dautkha...@gmail.com> wrote:
You can set PYSPARK_PYTHON environment variable for that.

Not sure about zeppelin.pyspark.python; I think it does not work.
See the comments in https://issues.apache.org/jira/browse/ZEPPELIN-1265


Eventually, I think we can remove zeppelin.pyspark.python and use only 
PYSPARK_PYTHON instead, to avoid confusion.
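
For reference, a minimal sketch of what setting this looks like in conf/zeppelin-env.sh; the /shared/conda/envs/py35 path is a hypothetical placeholder for a conda environment on shared storage:

```shell
# conf/zeppelin-env.sh
# Point the PySpark workers (and, optionally, the driver) at the same
# Python interpreter. The path below is a placeholder; substitute the
# path to your own shared conda environment.
export PYSPARK_PYTHON=/shared/conda/envs/py35/bin/python
export PYSPARK_DRIVER_PYTHON=/shared/conda/envs/py35/bin/python
```

Note that changes to this file only take effect after the Zeppelin daemon is restarted, which is exactly the limitation discussed below.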


--
Ruslan Dautkhanov

On Mon, Mar 20, 2017 at 12:59 PM, William Markito Oliveira 
<mark...@apache.org> wrote:
I'm trying to use zeppelin.pyspark.python as the variable to set the python 
that Spark worker nodes should use for my job, but it doesn't seem to be 
working.

Am I missing something, or does this variable not do that?

My goal is to change that variable to point to different conda environments. 
These environments are available on all worker nodes since they're in a shared 
location, so ideally all nodes would have access to the same libraries and 
dependencies.

Thanks,

~/William




--
~/William
