[
https://issues.apache.org/jira/browse/SPARK-1828?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14098752#comment-14098752
]
Patrick Wendell commented on SPARK-1828:
----------------------------------------
Maxim - I think what you are pointing out is unrelated to this exact issue. Spark
hard-codes a specific version of Hive in our build. This is true whether we
point to a slightly modified version of Hive 0.12 or to the actual Hive 0.12.
The issue is that Hive does not have stable APIs, so we can't provide a version
of Spark that is cross-compatible with different versions of Hive. We are
trying to simplify our dependency on Hive to fix this.
Are you proposing a specific change here?
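As a rough illustration of what "hard-codes a specific version of Hive" means in
practice, here is a minimal sbt sketch; the module list and version below are
illustrative, not Spark's actual build definition:

    // Illustrative sbt sketch: the Hive version is fixed at build time, so
    // building against a different Hive release means editing the build and
    // recompiling rather than configuring it at deployment time.
    val hiveVersion = "0.12.0"

    libraryDependencies ++= Seq(
      "org.apache.hive" % "hive-exec"      % hiveVersion,
      "org.apache.hive" % "hive-metastore" % hiveVersion,
      "org.apache.hive" % "hive-serde"     % hiveVersion
    )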
> Created forked version of hive-exec that doesn't bundle other dependencies
> --------------------------------------------------------------------------
>
> Key: SPARK-1828
> URL: https://issues.apache.org/jira/browse/SPARK-1828
> Project: Spark
> Issue Type: Bug
> Components: SQL
> Affects Versions: 1.0.0
> Reporter: Patrick Wendell
> Assignee: Patrick Wendell
> Priority: Blocker
> Fix For: 1.0.0
>
>
> The hive-exec jar includes a number of Hive's dependencies in addition to Hive
> itself (protobuf, Guava, etc.). See HIVE-5733. This breaks any attempt in
> Spark to manage those dependencies.
> The only solution to this problem is to publish our own version of hive-exec
> 0.12.0 that behaves correctly. While we are doing this, we might as well
> rewrite the protobuf dependency to use the shaded version of protobuf 2.4.1
> that we already have for Akka.
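
A minimal sbt sketch of the intended end state, assuming a re-published
hive-exec artifact that no longer bundles its dependencies (the group id and
dependency versions here are illustrative, not the actual published
coordinates):

    // Illustrative sketch: with a hive-exec jar that does not bundle protobuf
    // and Guava classes inside itself, those libraries become ordinary
    // transitive dependencies that the build can pin or exclude explicitly.
    libraryDependencies ++= Seq(
      "org.spark-project.hive" % "hive-exec"     % "0.12.0",
      "com.google.protobuf"    % "protobuf-java" % "2.4.1",
      "com.google.guava"       % "guava"         % "14.0.1"
    )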