[ 
https://issues.apache.org/jira/browse/HIVE-8795?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14208476#comment-14208476
 ] 

Szehon Ho commented on HIVE-8795:
---------------------------------

I don't think it's scalable to install Spark on all the build machines, because 
every time Spark has any change, we would have to recreate the images every 
single time.

It might be possible to set SPARK_HOME to the assembly location (which is 
freshly downloaded to the local maven repo anyway, according to the JIRA 
description?).  Do you know if that is the case?
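To illustrate the suggestion, here is a minimal sketch of pointing SPARK_HOME at an assembly already present in the local maven repo. The repository path, artifact coordinates, and version below are assumptions for illustration, not something confirmed by this JIRA:

```shell
# Hypothetical sketch: reuse the Spark assembly that maven already downloaded,
# instead of baking a full Spark installation into every build image.
# The coordinates and version here are illustrative assumptions.
SPARK_VERSION="1.2.0"
export SPARK_HOME="$HOME/.m2/repository/org/apache/spark/spark-assembly_2.10/${SPARK_VERSION}"
echo "SPARK_HOME=$SPARK_HOME"
```

If something like this works, only the build machines' maven cache needs to stay current, which is handled by the normal dependency download.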

> Switch precommit test from local to local-cluster [Spark Branch]
> ----------------------------------------------------------------
>
>                 Key: HIVE-8795
>                 URL: https://issues.apache.org/jira/browse/HIVE-8795
>             Project: Hive
>          Issue Type: Sub-task
>          Components: Spark
>            Reporter: Xuefu Zhang
>            Assignee: Szehon Ho
>
>  It seems unlikely that the Spark community will provide an MRMiniCluster 
> equivalent (SPARK-3691), and Spark local-cluster was the recommendation. Latest 
> research shows that Spark local-cluster works with Hive. Therefore, for now, we 
> use Spark local-cluster (instead of the current local mode) for our precommit 
> tests.
> It was previously believed (HIVE-7382) that a Spark installation is required 
> and that the SPARK_HOME env variable needs to be set. Since Hive pulls in 
> Spark's assembly jar, it's believed now that we only need a few scripts from a 
> Spark installation instead.
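The switch described above amounts to changing the Spark master URL used by the tests. A minimal sketch, where the worker/core/memory figures are illustrative assumptions rather than the values this JIRA settled on:

```shell
# Hypothetical sketch: run tests against a multi-process local cluster instead
# of the single-JVM "local" master. The Spark master URL format is
# local-cluster[numWorkers,coresPerWorker,memoryPerWorkerMB].
# 2 workers x 2 cores x 1024 MB here is an assumed, illustrative sizing.
MASTER="local-cluster[2,2,1024]"
echo "spark.master=$MASTER"
```

Unlike a standalone cluster, local-cluster spawns its worker processes from the test JVM's machine, which is why only a few scripts from a Spark installation (rather than a full deployment) should be needed.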



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
