I think you have to run it with $SPARK_HOME/bin/pyspark /path/to/pi.py
instead of a plain "python pi.py" — that way the PySpark libraries are put
on the Python path for you.
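If you do want to keep using plain "python", a common workaround (a sketch — it assumes SPARK_HOME points at your Spark install, and the py4j zip name under $SPARK_HOME/python/lib varies by Spark version) is to put the PySpark sources on PYTHONPATH yourself:

```shell
# Make the pyspark package importable from a plain `python` process.
# SPARK_HOME and the py4j zip version below are assumptions — adjust
# them to match your installation.
export SPARK_HOME=/path/to/spark
export PYTHONPATH="$SPARK_HOME/python:$SPARK_HOME/python/lib/py4j-0.8.2.1-src.zip:$PYTHONPATH"
```

After that, "python ./examples/src/main/python/pi.py" should be able to do
"from pyspark import SparkContext" without the ImportError.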

On Mon, Feb 9, 2015 at 11:22 PM, Ashish Kumar <ashish.ku...@innovaccer.com>
wrote:

> *Command:*
> sudo python ./examples/src/main/python/pi.py
>
> *Error:*
> Traceback (most recent call last):
>   File "./examples/src/main/python/pi.py", line 22, in <module>
>     from pyspark import SparkContext
> ImportError: No module named pyspark
>
>


-- 
Mohit

"When you want success as badly as you want the air, then you will get it.
There is no other secret of success."
-Socrates
