Hi there,

There is a known issue with PySpark-on-YARN that requires users to build
Spark with Java 6. The issue is that Java 6 and Java 7 package large jar
files differently, and Python can't read the pyspark module out of an
assembly jar built with Java 7, which leads to the error you're seeing.

Can you try rebuilding Spark with Java 6 and running your job again?
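
Something along these lines should do it (a rough sketch, untested --
adjust the JDK path and the Hadoop version/profile for your cluster):

  # Point the build at a JDK 6 install; the JAVA_HOME path and the
  # Hadoop version below are just examples, not your exact values.
  export JAVA_HOME=/usr/lib/jvm/java-1.6.0
  cd /path/to/spark
  mvn -Pyarn -Dhadoop.version=2.2.0 -DskipTests clean package

Double-check that "java -version" reports 1.6 before kicking off the build.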

- Patrick

On Fri, Jun 27, 2014 at 5:00 PM, sdeb <sangha...@gmail.com> wrote:
> Hello,
>
> I have installed Spark on top of Hadoop + YARN.
> When I launch the pyspark shell and try to compute something, I get this error:
>
> Error from python worker:
>   /usr/bin/python: No module named pyspark
>
> The pyspark module should be there; do I have to create an external link to it?
>
> --Sanghamitra.
