GitHub user srowen opened a pull request:

    https://github.com/apache/spark/pull/744

    SPARK-1802. Audit dependency graph when Spark is built with -Phive

    This initial commit resolves the dependency conflicts in the Hive profile,
    as noted in https://issues.apache.org/jira/browse/SPARK-1802.
    
    Most of the fix comes from observing that Hive drags in Avro, so if the
    hive module depends on Spark's own version of the `avro-*` dependencies,
    it pulls in our exclusions as needed too. However, I found that some
    exclusions need to be copied between the two Avro dependencies to get this
    right, and a few commons-logging intrusions had to be squashed as well.
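    
    To illustrate the shape of the change (a hypothetical sketch, not the
    exact diff in this PR), depending on the two Avro artifacts in the hive
    module's POM and repeating the logging exclusion on both would look
    roughly like this, assuming the versions are managed in the parent POM:
    
        <!-- Hypothetical sketch for the hive module's pom.xml: depend on the
             Avro artifacts Spark already manages so the Hive profile resolves
             the same versions, and repeat the exclusions on each of them. -->
        <dependency>
          <groupId>org.apache.avro</groupId>
          <artifactId>avro</artifactId>
          <exclusions>
            <!-- Keep duplicate logging implementations out of the assembly. -->
            <exclusion>
              <groupId>commons-logging</groupId>
              <artifactId>commons-logging</artifactId>
            </exclusion>
          </exclusions>
        </dependency>
        <dependency>
          <groupId>org.apache.avro</groupId>
          <artifactId>avro-mapred</artifactId>
          <exclusions>
            <!-- Maven does not share exclusions between dependency
                 declarations, so the same exclusion is copied here. -->
            <exclusion>
              <groupId>commons-logging</groupId>
              <artifactId>commons-logging</artifactId>
            </exclusion>
          </exclusions>
        </dependency>
    
    (One way to audit the resulting graph is `mvn -Phive dependency:tree`.)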
    
    This turned up another annoying discovery: `hive-exec` is essentially an
    "assembly" artifact that _also_ packages all of its transitive
    dependencies. As a result, the final assembly shows lots of collisions
    between `hive-exec` and its own dependencies, and even other project
    dependencies. I have a TODO to examine whether that is going to be a
    deal-breaker or not.
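    
    As a bit of context (again just a hypothetical sketch, not this PR's
    diff): exclusions on the `hive-exec` dependency only prune what Maven
    resolves transitively, so they can't do anything about whatever classes
    `hive-exec` repackages inside its own jar.
    
        <!-- Hypothetical sketch: the exclusion removes the separately resolved
             commons-logging jar from the dependency graph, but if hive-exec
             also bundles copies of such classes inside its own jar, those
             copies still land in the assembly and show up as collisions. -->
        <dependency>
          <groupId>org.apache.hive</groupId>
          <artifactId>hive-exec</artifactId>
          <exclusions>
            <exclusion>
              <groupId>commons-logging</groupId>
              <artifactId>commons-logging</artifactId>
            </exclusion>
          </exclusions>
        </dependency>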
    
    In the meantime I'm going to tack on a second commit to this PR that also
    fixes a few similar, remaining collisions in the YARN profile.

You can merge this pull request into a Git repository by running:

    $ git pull https://github.com/srowen/spark SPARK-1802

Alternatively you can review and apply these changes as the patch at:

    https://github.com/apache/spark/pull/744.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

    This closes #744
    
----
commit a856604cfc67cb58146ada01fda6dbbb2515fa00
Author: Sean Owen <[email protected]>
Date:   2014-05-12T10:08:21Z

    Resolve JAR version conflicts specific to Hive profile

----


