Github user pwendell commented on the pull request:

    https://github.com/apache/spark/pull/2241#issuecomment-58937829
  
    Okay then how about this to keep it simple:
    
    1. Add the `parquet-hive-bundle` dependency inside of `hive/pom.xml`, inside of a block for the `hive-0.12.0` profile only.
    2. Change the default values of `hive.version`, `hive.short.version`, etc. to 0.13.1.
    3. Update the instructions as follows:
    
    ```
    # Apache Hadoop 2.4.X with Hive 13 support
    mvn -Pyarn -Phadoop-2.4 -Dhadoop.version=2.4.0 -Phive -DskipTests clean package
    
    # Apache Hadoop 2.4.X with Hive 12 support
    mvn -Pyarn -Phive-0.12.0 -Phadoop-2.4 -Dhadoop.version=2.4.0 -Phive -DskipTests clean package
    ```
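
    As a rough sketch of step 1, the profile-gated dependency might look like the following (the `com.twitter` groupId and the `${parquet.version}` property are assumptions, not confirmed coordinates):
    
    ```
    <!-- hive/pom.xml (sketch) -->
    <profiles>
      <profile>
        <id>hive-0.12.0</id>
        <dependencies>
          <!-- Only pulled in when building with -Phive-0.12.0 -->
          <dependency>
            <groupId>com.twitter</groupId>
            <artifactId>parquet-hive-bundle</artifactId>
            <version>${parquet.version}</version>
          </dependency>
        </dependencies>
      </profile>
    </profiles>
    ```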
    
    The issue is that we can't rely on dynamic profile activation because it breaks builds that link against the `spark-hive` artifact. Many build tools, like SBT, do not support loading profiles from the poms of other projects.
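
    For context, linking against the published artifact from SBT looks roughly like this (the version number is illustrative) -- there is no mechanism here to request a Maven profile, so any profile-gated dependencies in the `spark-hive` pom would simply be dropped:
    
    ```
    // build.sbt (sketch; version is illustrative)
    libraryDependencies += "org.apache.spark" %% "spark-hive" % "1.2.0"
    ```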

