Agreed. PySpark calls spark-submit under the hood; check out the command line it builds there.

--- Original Message ---

From: "Mohit Singh" <mohit1...@gmail.com>
Sent: February 9, 2015 11:26 PM
To: "Ashish Kumar" <ashish.ku...@innovaccer.com>
Cc: user@spark.apache.org
Subject: Re: ImportError: No module named pyspark, when running pi.py

I think you have to run it with $SPARK_HOME/bin/pyspark /path/to/pi.py
instead of the plain "python pi.py"
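
For reference, a minimal sketch of the two usual ways to make the import work, assuming SPARK_HOME points at your Spark installation (the path below is a placeholder, not a real location):

```shell
# Assumption: adjust this to wherever Spark is actually unpacked.
export SPARK_HOME=/path/to/spark

# Option 1: let the bundled launcher set up the Python path for you.
$SPARK_HOME/bin/spark-submit examples/src/main/python/pi.py

# Option 2: expose the pyspark package yourself, then use plain python.
export PYTHONPATH="$SPARK_HOME/python:$PYTHONPATH"
python examples/src/main/python/pi.py
```

Either way, the "No module named pyspark" error comes from pyspark not being on Python's module search path, which the launcher scripts normally handle for you.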

On Mon, Feb 9, 2015 at 11:22 PM, Ashish Kumar <ashish.ku...@innovaccer.com>
wrote:

> *Command:*
> sudo python ./examples/src/main/python/pi.py
>
> *Error:*
> Traceback (most recent call last):
>   File "./examples/src/main/python/pi.py", line 22, in <module>
>     from pyspark import SparkContext
> ImportError: No module named pyspark
>
>


--
Mohit

"When you want success as badly as you want the air, then you will get it.
There is no other secret of success."
-Socrates
