[ https://issues.apache.org/jira/browse/HIVE-12880?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15105728#comment-15105728 ]

Sergey Shelukhin commented on HIVE-12880:
-----------------------------------------

It seems like the default spark-assembly built from Spark itself includes Hive.
This is what I'd expect most independent users to have...
If I am correct about this (I'm not very familiar with the Spark build), I wonder
whether it makes sense to either (1) add a new published jar to Spark that
excludes this spurious Hive version and use that, or (2) stop adding the assembly
by default with this in mind? On a higher level, we don't add e.g. Tez jars
unless they are added explicitly (and they don't even package Hive ;)).
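
A quick way to confirm whether a given spark-assembly jar actually bundles Hive
classes is to scan the archive for org/apache/hadoop/hive entries. A minimal
sketch, assuming Java 8+; the jar path below is only a placeholder and will
differ per installation:

    import java.util.zip.ZipEntry;
    import java.util.zip.ZipFile;

    // Lists Hive classes bundled inside an assembly jar, if any.
    public class AssemblyScan {
        public static void main(String[] args) throws Exception {
            // Placeholder path; pass the real spark-assembly location as the first argument.
            String jarPath = args.length > 0 ? args[0] : "/opt/spark/lib/spark-assembly.jar";
            try (ZipFile jar = new ZipFile(jarPath)) {
                jar.stream()
                   .map(ZipEntry::getName)
                   .filter(name -> name.startsWith("org/apache/hadoop/hive/"))
                   .limit(20)  // a few entries are enough to confirm
                   .forEach(System.out::println);
            }
        }
    }

If that prints org/apache/hadoop/hive/conf/HiveConf.class, the assembly carries
its own copy of the class and can shadow the one shipped with Hive, depending on
classpath order.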

> spark-assembly causes Hive class version problems
> -------------------------------------------------
>
>                 Key: HIVE-12880
>                 URL: https://issues.apache.org/jira/browse/HIVE-12880
>             Project: Hive
>          Issue Type: Bug
>            Reporter: Hui Zheng
>
> It looks like spark-assembly contains versions of Hive classes (e.g. 
> HiveConf), and these sometimes (always?) come from older versions of Hive.
> We've seen problems where, depending on classpath perturbations, a
> NoSuchFieldError may be thrown for recently added ConfVars because the
> HiveConf class comes from spark-assembly.
> Would making sure spark-assembly comes last in the classpath solve the 
> problem?
> Otherwise, can we depend on something that does not package older Hive 
> classes?
> Currently, HIVE-12179 provides a workaround (in the non-Spark use case, at
> least; I am assuming this issue can also affect Hive-on-Spark).
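
To illustrate the failure mode described above: when HiveConf is resolved from
an older copy inside spark-assembly, references to newer ConfVars constants fail
at runtime. A minimal sketch of a runtime check, assuming a Hive client
classpath; the ConfVars name used here is just an example of a newer constant:

    import org.apache.hadoop.hive.conf.HiveConf;

    public class HiveConfOrigin {
        public static void main(String[] args) {
            // Shows which jar actually supplied HiveConf on this classpath.
            System.out.println("HiveConf loaded from: "
                    + HiveConf.class.getProtectionDomain().getCodeSource().getLocation());

            // Example name only; substitute any recently added ConfVars constant.
            String candidate = args.length > 0 ? args[0] : "HIVE_SERVER2_ENABLE_DOAS";
            try {
                HiveConf.ConfVars.valueOf(candidate);
                System.out.println(candidate + " is present in this HiveConf");
            } catch (IllegalArgumentException e) {
                // An older HiveConf (e.g. the copy inside spark-assembly) lacks
                // newer constants, which is the same mismatch that surfaces as
                // NoSuchFieldError in compiled code.
                System.out.println(candidate + " is missing; an older HiveConf is on the classpath");
            }
        }
    }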



