Moon,

I can run the same code in the pyspark shell, but it fails in Zeppelin.




> On Dec 8, 2015, at 7:43 PM, moon soo Lee <m...@apache.org> wrote:
> 
> Tried with 0.5.5-incubating release after adding SPARK_1_5_2 in 
> spark/src/main/java/org/apache/zeppelin/spark/SparkVersion.java.
> 
> My conf/zeppelin-env.sh has only SPARK_HOME, which points to a Spark 1.5.2 
> distribution, and I was able to run %pyspark without any problem.
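> 
> For reference, a minimal conf/zeppelin-env.sh along those lines (the path 
> below is just a placeholder for wherever your Spark 1.5.2 lives):
> 
> export SPARK_HOME=/opt/spark-1.5.2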
> 
> When you run
> 
> System.getenv("PYTHONPATH")
> 
> in the notebook, what do you see? Can you check that those files and 
> directories exist?
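> 
> For example, something like this in a %spark paragraph would print each 
> PYTHONPATH entry and whether it exists on disk (a quick sketch, untested 
> against your setup):
> 
> val p = System.getenv("PYTHONPATH")
> println(p)
> Option(p).foreach(_.split(java.io.File.pathSeparator).foreach { e =>
>   println(s"$e exists: ${new java.io.File(e).exists}")
> })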
> 
> Thanks,
> moon
> 
> On Tue, Dec 8, 2015 at 6:22 PM Fengdong Yu <fengdo...@everstring.com> wrote:
> I tried; I get the same error as before.
> 
> I even tried removing spark.yarn.jar from interpreter.json; it still fails 
> with the same error.
> 
> 
> 
>> On Dec 8, 2015, at 5:07 PM, moon soo Lee <leemoon...@gmail.com> wrote:
>> 
>> Can you try setting only SPARK_HOME, without setting PYTHONPATH?
>> 
>> Thanks,
>> moon
>> 
>> 
>> On Tue, Dec 8, 2015 at 6:04 PM Amjad ALSHABANI <ashshab...@gmail.com> wrote:
>> Hello,
>> 
>> Are you sure that you've installed the pyspark module?
>> 
>> Please check whether your Spark installation directory contains the python 
>> sub-directory.
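>> 
>> For example (assuming SPARK_HOME points at your Spark install):
>> 
>> ls "$SPARK_HOME/python"
>> 
>> In a Spark 1.5.2 distribution you should see a pyspark/ package there and 
>> a lib/ directory containing pyspark.zip and the py4j zip.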
>> Amjad
>> 
>> On Dec 8, 2015 9:55 AM, "Fengdong Yu" <fengdo...@everstring.com> wrote:
>> Hi
>> 
>> I am using Zeppelin-0.5.5 with Spark 1.5.2
>> 
>> It cannot find the pyspark module.
>> 
>> 
>> Error from python worker:
>>  /usr/local/bin/python: No module named pyspark
>> PYTHONPATH was:
>> 
>> 
>> 
>> I’ve configured PYTHONPATH in zeppelin-env.sh:
>> 
>> export 
>> PYTHONPATH=$SPARK_HOME/python:$SPARK_HOME/python/lib/py4j-0.8.2.1-src.zip:$SPARK_HOME/python/lib/pyspark.zip
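>> 
>> A quick way to double-check that those archives exist at the paths 
>> referenced above:
>> 
>> ls -l "$SPARK_HOME/python/lib/py4j-0.8.2.1-src.zip" "$SPARK_HOME/python/lib/pyspark.zip"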
>> 
>> 
>> Is there anything else I missed? Thanks
>> 
>> 
>> 
> 
