[ https://issues.apache.org/jira/browse/SPARK-11342?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14975893#comment-14975893 ]

Sean Owen commented on SPARK-11342:
-----------------------------------

Hm... maybe so. If so, then I think this script is more for running a default test
profile. Why not build/test directly with Maven/SBT when you need to test a
specific set of flags? That is, this wouldn't be the end of such requirements:
what if you also need to set a particular Hive version? In the end I think you
want to run manually anyway.
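The manual route suggested above can be sketched with direct Maven invocations. This is only an illustration; the exact profile names (e.g. {{-Phadoop-2.6}}, {{-Phive}}) are assumptions based on the profiles Spark's pom.xml defined around this era, and should be checked against the build file:

```shell
# Build the assembly against a specific Hadoop version
# (profile and property names assumed; verify against pom.xml)
build/mvn -Phadoop-2.6 -Dhadoop.version=2.6.0 -DskipTests clean package

# Run the tests with the same flags, optionally pinning the Hive profile too
build/mvn -Phadoop-2.6 -Dhadoop.version=2.6.0 -Phive test
```

Because the same profile flags are passed to both the package and test steps, only one assembly jar is produced, avoiding the "multiple assembly jars" complaint from bin/spark-shell described below.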

> Allow to set hadoop profile when running dev/run_tests
> ------------------------------------------------------
>
>                 Key: SPARK-11342
>                 URL: https://issues.apache.org/jira/browse/SPARK-11342
>             Project: Spark
>          Issue Type: Improvement
>          Components: Tests
>            Reporter: Jeff Zhang
>            Priority: Minor
>
> Usually I assemble Spark with Hadoop 2.6.0, but when I run dev/run_tests, it 
> uses hadoop-2.3. When I then run bin/spark-shell, it complains that there are 
> multiple Spark assembly jars. It would be nice if I could specify the Hadoop 
> profile when running dev/run_tests.



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
