It is dynamic; you can set the environment variable on the interpreter settings page.
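
For example (just a sketch; the property name and the path below are illustrative, 
not from this thread), after adding PYSPARK_PYTHON as a property on the Spark 
interpreter settings page and restarting that interpreter, a %pyspark paragraph 
can confirm the variable reached the interpreter process:

    %pyspark
    import os, sys

    # value set on the interpreter settings page,
    # e.g. /shared/conda/envs/myenv/bin/python (illustrative path)
    print("PYSPARK_PYTHON:", os.environ.get("PYSPARK_PYTHON"))

    # Python actually running the driver-side PySpark process
    print("driver python: ", sys.executable)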


Best Regards,
Jeff Zhang


From: Ruslan Dautkhanov <[email protected]>
Reply-To: "[email protected]" <[email protected]>
Date: Tuesday, March 21, 2017 at 3:27 AM
To: users <[email protected]>
Subject: Re: Should zeppelin.pyspark.python be used on the worker nodes ?

You're right - it will not be dynamic.

You may want to check
https://issues.apache.org/jira/browse/ZEPPELIN-2195
https://github.com/apache/zeppelin/pull/2079
It seems this is fixed in the current snapshot of Zeppelin (committed 3 weeks ago).

--
Ruslan Dautkhanov

On Mon, Mar 20, 2017 at 1:21 PM, William Markito Oliveira 
<[email protected]> wrote:
Thanks for the quick response Ruslan.

But given that it's an environment variable, I can't quickly change that value 
and point to a different Python environment without restarting the Zeppelin 
process, can I? I mean, is there a way to set the value for PYSPARK_PYTHON from 
the Interpreter configuration screen?

Thanks,


On Mon, Mar 20, 2017 at 2:15 PM, Ruslan Dautkhanov 
<[email protected]> wrote:
You can set the PYSPARK_PYTHON environment variable for that.

Not sure about zeppelin.pyspark.python; I think it does not work.
See the comments in https://issues.apache.org/jira/browse/ZEPPELIN-1265
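
For instance (a rough sketch, with a made-up path), if PYSPARK_PYTHON is exported 
before Zeppelin starts, e.g. in conf/zeppelin-env.sh, a %pyspark paragraph can 
report which Python the executors actually launch:

    %pyspark
    # Runs sys.executable inside Spark tasks, so it reports the workers' Python;
    # it should match the exported PYSPARK_PYTHON
    # (e.g. /shared/conda/envs/myenv/bin/python -- illustrative only).
    workers = (sc.range(0, sc.defaultParallelism)
                 .map(lambda _: __import__("sys").executable)
                 .distinct()
                 .collect())
    print(workers)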


Eventually, I think we can remove zeppelin.pyspark.python and use only 
PYSPARK_PYTHON instead, to avoid confusion.


--
Ruslan Dautkhanov

On Mon, Mar 20, 2017 at 12:59 PM, William Markito Oliveira 
<[email protected]> wrote:
I'm trying to use zeppelin.pyspark.python as the variable to set the Python that 
the Spark worker nodes should use for my job, but it doesn't seem to be working.

Am I missing something, or does this variable not do that?

My goal is to change that variable to point to different conda environments. 
These environments are available on all worker nodes since they live in a shared 
location, so ideally every node would have access to the same libraries and 
dependencies.
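
As a quick sanity check (a sketch only; both the path and numpy are just 
placeholders for whatever the shared env actually provides), something like the 
following can confirm that every worker resolves packages from the same shared 
conda environment:

    %pyspark
    def probe(_):
        # numpy stands in for any library that the shared conda env
        # (e.g. /shared/conda/envs/myenv) is expected to provide
        import socket, numpy
        return (socket.gethostname(), numpy.__version__, numpy.__file__)

    # one element per partition is enough to touch each executor that gets work
    print(sc.range(0, sc.defaultParallelism)
            .map(probe)
            .distinct()
            .collect())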

Thanks,

~/William




--
~/William
