Hi Clark,

How did you build your Zeppelin binaries? Did you configure the pyspark interpreter
manually?


To configure pyspark automatically while building the binaries, use the command below:

mvn clean package -Pspark-1.3 -Ppyspark -Dhadoop.version=2.6.0-cdh5.4.2 -Phadoop-2.6 -DskipTests
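
Alternatively, if you already have a Spark distribution on the machine, you can point the pyspark interpreter at it manually in conf/zeppelin-env.sh. A minimal sketch, assuming Spark is installed under /opt/spark and ships py4j-0.8.2.1 (adjust both to match your installation):

export SPARK_HOME=/opt/spark
export PYTHONPATH=$SPARK_HOME/python:$SPARK_HOME/python/lib/py4j-0.8.2.1-src.zip:$PYTHONPATH

Then restart the daemon (bin/zeppelin-daemon.sh restart) so the interpreter picks up the new environment.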


Try the above, and let me know if you still face the same issue.
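
As a quick sanity check outside Zeppelin, you can also try the exact import the traceback complains about from a plain shell. A sketch, assuming SPARK_HOME is set and the py4j version matches the zip under $SPARK_HOME/python/lib:

PYTHONPATH="$SPARK_HOME/python:$SPARK_HOME/python/lib/py4j-0.8.2.1-src.zip" python -c "from pyspark.conf import SparkConf; print('pyspark importable')"

If that command fails, the paths are wrong; if it succeeds but Zeppelin still errors, the daemon is probably not seeing the same environment.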


Thanks
Karthik


From: clark djilo kuissu [mailto:djilokui...@yahoo.fr]
Sent: Tuesday, August 4, 2015 4:49 AM
To: Users; Moon Soo Lee
Subject: Launch Pyspark Interpreter

Hi,

I am trying to launch the pyspark interpreter, without success.

I ran the server:

$ bin/zeppelin-daemon.sh start

Running a simple notebook beginning with %pyspark, I got an error about py4j
not being found, so I ran pip install py4j (ref).

Now I'm getting this error:

pyspark is not responding
Traceback (most recent call last):
  File "/tmp/zeppelin_pyspark.py", line 22, in <module>
    from pyspark.conf import SparkConf
ImportError: No module named pyspark.conf

So I tried this in my .bashrc file, taken from Stack Overflow:


export SPARK_HOME=/spark
export PYTHONPATH=$SPARK_HOME/python:$SPARK_HOME/python/lib/py4j-0.8.2.1-src.zip:$PYTHONPATH

It didn't work.

What am I supposed to do?

Regards,

Clark
