Hi,    
I'm trying to launch the pyspark interpreter, without success.
I ran the server:

$ bin/zeppelin-daemon.sh start

Running a simple notebook beginning with %pyspark, I got an error about py4j
not being found, so I just did pip install py4j (ref).
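
For reference, this is what I ran, plus a bare import as a sanity check (just my guess at how to verify the install):

$ pip install py4j
$ python -c "import py4j"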

Now I'm getting this error:

pyspark is not responding
Traceback (most recent call last):
  File "/tmp/zeppelin_pyspark.py", line 22, in <module>
    from pyspark.conf import SparkConf
ImportError: No module named pyspark.conf
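
As far as I can tell this means the pyspark package itself isn't on PYTHONPATH; the same import fails for me from a plain shell too:

$ python -c "from pyspark.conf import SparkConf"
ImportError: No module named pyspark.conf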
So I tried this in my .bashrc, taken from a Stack Overflow answer:

# exported so that processes started afterwards can see them
export SPARK_HOME=/spark
export PYTHONPATH=$SPARK_HOME/python:$SPARK_HOME/python/lib/py4j-0.8.2.1-src.zip:$PYTHONPATH
It didn't work.
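Should these exports go in Zeppelin's own conf/zeppelin-env.sh instead, so the daemon picks them up? Something like this (assuming Spark really is installed at /spark), followed by a restart:

export SPARK_HOME=/spark
export PYTHONPATH=$SPARK_HOME/python:$SPARK_HOME/python/lib/py4j-0.8.2.1-src.zip

$ bin/zeppelin-daemon.sh restart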
What am I supposed to do?
Regards,
Clark
