I tried; it's the same error now.

I even tried removing spark.yarn.jar from interpreter.json, but it still gives the same error.
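For reference, following moon's suggestion, a minimal zeppelin-env.sh that sets only SPARK_HOME and leaves PYTHONPATH alone might look like this (the install path is an assumption; adjust it to your machine):

```shell
# zeppelin-env.sh -- set only SPARK_HOME; Zeppelin's interpreter launcher
# then derives PYTHONPATH ($SPARK_HOME/python plus the bundled py4j zip)
# on its own.
# /usr/local/spark-1.5.2 is an assumed install path; change as needed.
export SPARK_HOME=/usr/local/spark-1.5.2

# Deliberately do NOT export PYTHONPATH here, so Zeppelin computes it.
```

After changing zeppelin-env.sh, restart the Zeppelin daemon so the new environment takes effect.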



> On Dec 8, 2015, at 5:07 PM, moon soo Lee <leemoon...@gmail.com> wrote:
> 
> Can you not try to set PYTHONPATH but only SPARK_HOME?
> 
> Thanks,
> moon
> 
> 
> On Tue, Dec 8, 2015 at 6:04 PM Amjad ALSHABANI <ashshab...@gmail.com> wrote:
> Hello,
> 
> Are you sure that you've installed the pyspark module?
> 
> Please check your Spark installation directory to see whether it contains the python 
> sub-directory.
> Amjad
> 
> On Dec 8, 2015 9:55 AM, "Fengdong Yu" <fengdo...@everstring.com> wrote:
> Hi
> 
> I am using Zeppelin-0.5.5 with Spark 1.5.2
> 
> It cannot find the pyspark module.
> 
> 
> Error from python worker:
>  /usr/local/bin/python: No module named pyspark
> PYTHONPATH was:
> 
> 
> 
> I've configured pyspark in zeppelin-env.sh:
> 
> export 
> PYTHONPATH=$SPARK_HOME/python:$SPARK_HOME/python/lib/py4j-0.8.2.1-src.zip:$SPARK_HOME/python/lib/pyspark.zip
> 
> 
> Is there anything else I missed? Thanks
> 
> 
> 