Hello, I have installed Spark on top of Hadoop + YARN. When I launch the pyspark shell and try to compute something, I get this error:
Error from python worker: /usr/bin/python: No module named pyspark

The pyspark module should be there; do I have to link to it manually? --Sanghamitra. -- View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/hadoop-yarn-spark-tp8466.html Sent from the Apache Spark User List mailing list archive at Nabble.com.
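[Editor's note: this error usually means the Python interpreter launched for the workers cannot find pyspark on its PYTHONPATH. A minimal sketch of one common fix via conf/spark-env.sh follows; the install path and the py4j version are assumptions for illustration, not taken from the original post.]

```shell
# conf/spark-env.sh  (config fragment, sourced by Spark at startup)
# Make pyspark and its bundled py4j visible to the worker Python.
# /usr/local/spark and py4j-0.8.1 are assumed values; adjust to your install.
export SPARK_HOME=/usr/local/spark
export PYTHONPATH="$SPARK_HOME/python:$SPARK_HOME/python/lib/py4j-0.8.1-src.zip:$PYTHONPATH"
```

After editing the file, restart the pyspark shell so the new environment is picked up.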