[ https://issues.apache.org/jira/browse/SPARK-1573?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=13979569#comment-13979569 ]

Sean Owen commented on SPARK-1573:
----------------------------------

I think it's to be expected that you have to run tests the same way you build. 
This is what 
https://cwiki.apache.org/confluence/display/SPARK/Contributing+to+Spark directs 
you to do in step 5 already.

Maybe the wiki could state explicitly that you can do both in one command 
(sbt/sbt assembly test), which means the build properties are necessarily the 
same for both steps. I don't have edit rights or I'd adjust it myself.
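For example (a rough sketch, reusing the CDH Hadoop version quoted in the 
report below), the combined invocation would look like:

  SPARK_HADOOP_VERSION=2.3.0-cdh5.0.0 SPARK_YARN=true sbt/sbt assembly test

so the assembly and the tests necessarily see identical settings.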

> slight modification with regards to sbt/sbt test
> ------------------------------------------------
>
>                 Key: SPARK-1573
>                 URL: https://issues.apache.org/jira/browse/SPARK-1573
>             Project: Spark
>          Issue Type: Documentation
>          Components: Documentation
>            Reporter: Nishkam Ravi
>
> When the sources are built against a certain Hadoop version with 
> SPARK_YARN=true, the same settings seem necessary when running sbt/sbt test. 
> For example:
> SPARK_HADOOP_VERSION=2.3.0-cdh5.0.0 SPARK_YARN=true sbt/sbt assembly
> SPARK_HADOOP_VERSION=2.3.0-cdh5.0.0 SPARK_YARN=true sbt/sbt test
> Otherwise, build errors and test failures occur.



--
This message was sent by Atlassian JIRA
(v6.2#6252)
