[
https://issues.apache.org/jira/browse/SPARK-1828?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14098774#comment-14098774
]
Maxim Ivanov commented on SPARK-1828:
-------------------------------------
I don't have a pull request at hand, if that's what you're asking ;) But IMHO the
proper solution is to tinker with the Maven Shade plugin so that it drops the
classes pulled in by the Hive dependency in favor of those specified in the Spark POM.
If it is done that way, it would be possible to specify the Hive version using a
"-D" param, the same way we can specify the Hadoop version, and be sure (to some
extent, of course :) ) that if it builds, it works.
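A minimal sketch of the suggested approach, assuming the standard maven-shade-plugin filter syntax; the artifact coordinates and exclude patterns below are illustrative, not taken from the actual Spark build:

```xml
<!-- Hypothetical sketch: use shade-plugin filters to drop the classes that
     hive-exec bundles (protobuf, Guava, ...) so the versions declared in
     Spark's own POM win. Patterns here are assumptions for illustration. -->
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-shade-plugin</artifactId>
  <configuration>
    <filters>
      <filter>
        <!-- strip the copies bundled inside hive-exec -->
        <artifact>org.apache.hive:hive-exec</artifact>
        <excludes>
          <exclude>com/google/protobuf/**</exclude>
          <exclude>com/google/common/**</exclude>
        </excludes>
      </filter>
    </filters>
  </configuration>
</plugin>
```

With a filter like this in place, the Hive version could in principle be switched via a `-D` property profile, since the bundled classes would no longer shadow the ones Spark manages.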
> Created forked version of hive-exec that doesn't bundle other dependencies
> --------------------------------------------------------------------------
>
> Key: SPARK-1828
> URL: https://issues.apache.org/jira/browse/SPARK-1828
> Project: Spark
> Issue Type: Bug
> Components: SQL
> Affects Versions: 1.0.0
> Reporter: Patrick Wendell
> Assignee: Patrick Wendell
> Priority: Blocker
> Fix For: 1.0.0
>
>
> The hive-exec jar bundles a number of Hive's own dependencies in addition to
> Hive itself (protobuf, Guava, etc.). See HIVE-5733. This breaks any attempt in
> Spark to manage those dependencies.
> The only solution to this problem is to publish our own version of hive-exec
> 0.12.0 that behaves correctly. While we are doing this, we might as well
> rewrite the protobuf dependency to use the shaded version of protobuf 2.4.1
> that we already have for Akka.
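Rewriting the protobuf dependency as described would typically be done with a shade-plugin relocation. A hedged sketch, assuming the standard relocation syntax; the shaded package name below is an assumption for illustration, not necessarily the one Spark's build uses:

```xml
<!-- Hypothetical sketch: relocate protobuf references in the forked
     hive-exec onto a shaded package, so they resolve against the shaded
     protobuf 2.4.1 already published for Akka. The shadedPattern name
     is an assumption. -->
<relocations>
  <relocation>
    <pattern>com.google.protobuf</pattern>
    <shadedPattern>org.spark-project.protobuf</shadedPattern>
  </relocation>
</relocations>
```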
--
This message was sent by Atlassian JIRA
(v6.2#6252)
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]