[ https://issues.apache.org/jira/browse/SPARK-5808?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14490536#comment-14490536 ]

Apache Spark commented on SPARK-5808:
-------------------------------------

User 'vanzin' has created a pull request for this issue:
https://github.com/apache/spark/pull/5461

> Assembly generated by sbt does not contain pyspark
> --------------------------------------------------
>
>                 Key: SPARK-5808
>                 URL: https://issues.apache.org/jira/browse/SPARK-5808
>             Project: Spark
>          Issue Type: Bug
>          Components: Build
>            Reporter: Marcelo Vanzin
>            Priority: Minor
>
> When you generate the assembly with sbt, the py4j and pyspark files are 
> not added to it. As a result, pyspark does not work when you run that 
> assembly on YARN: SPARK_HOME is not propagated in YARN, so 
> PythonUtils.scala cannot add the needed pyspark paths to PYTHONPATH.
> This is minor, since all released bits are built with maven; it should 
> only affect developers who build with sbt and try pyspark on YARN.



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
