You can set spark.files in the interpreter setting to achieve the same purpose.
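
For example, on the Interpreter page you could edit the spark interpreter and add a property along these lines (the path is the one from your mail, so adjust it as needed):

  spark.files    /home/me/models/Churn/package/build/dist/fly_libs-1.1-py2.7.egg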


Best Regards,
Jeff Zhang


From: Meethu Mathew <meethu.mat...@flytxt.com>
Reply-To: "users@zeppelin.apache.org" <users@zeppelin.apache.org>
Date: Friday, March 17, 2017 at 5:28 PM
To: "users@zeppelin.apache.org" <users@zeppelin.apache.org>
Subject: --files in SPARK_SUBMIT_OPTIONS not working - ZEPPELIN-2136

Hi,

According to the Zeppelin documentation, to pass a Python package to the Zeppelin 
pyspark interpreter, you can export it through the --files option in 
SPARK_SUBMIT_OPTIONS in conf/zeppelin-env.sh.

When I add a .egg file through the --files option in SPARK_SUBMIT_OPTIONS, the 
Zeppelin notebook does not throw an error, but I am not able to import the module 
inside the notebook.
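
For reference, the import attempt in a notebook paragraph looks roughly like this (the module name is only my guess from the egg file name):

%pyspark
# module name assumed from fly_libs-1.1-py2.7.egg
import fly_libs   # fails with ImportError even though the egg was passed via --files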

The Spark version is 1.6.2, and the zeppelin-env.sh file (Zeppelin 0.7.0) looks like this:
export SPARK_HOME=/home/me/spark-1.6.1-bin-hadoop2.6
export SPARK_SUBMIT_OPTIONS="--jars /home/me/spark-csv-1.5.0-s_2.10.jar,/home/me/commons-csv-1.4.jar --files /home/me/models/Churn/package/build/dist/fly_libs-1.1-py2.7.egg"
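
As a workaround sketch (not verified to fix this issue), shipping the egg explicitly from a paragraph with sc.addPyFile should also put it on the Python path of the driver and executors:

%pyspark
# sketch only: ship the egg explicitly, then import it; path as in zeppelin-env.sh above
sc.addPyFile("/home/me/models/Churn/package/build/dist/fly_libs-1.1-py2.7.egg")
import fly_libs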

Is there any progress on this ticket: ZEPPELIN-2136 <https://issues.apache.org/jira/browse/ZEPPELIN-2136>?


Regards,
Meethu Mathew
