[ 
https://issues.apache.org/jira/browse/SPARK-20202?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Sean Owen updated SPARK-20202:
------------------------------
         Priority: Critical  (was: Blocker)
    Fix Version/s:     (was: 2.1.1)
                       (was: 1.6.4)
                       (was: 2.0.3)

I see wide agreement on that. One question I have: is including Hive this 
way merely really-not-nice-to-have, or actually not allowed? I think the 
question is whether the sources are available, right? Because releases can't 
have binary-only parts. I plead ignorance; I have never paid much attention 
to this integration myself. 

If it's not allowed, then this sounds like something that has to change for 
releases beyond 2.1.1, and this can be targeted as a Blocker accordingly.

Does this depend on refactoring or changes in Hive itself? IIRC the problem 
was hive-exec being an uber-jar, but it's been a long time since I read any 
of that discussion.
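
For concreteness, here is a sketch of what I understand the two options look 
like in the POM. The coordinates are from memory, and the exact versions and 
the "core" classifier are my assumptions, so take this as illustrative only:

```xml
<!-- Current state, as I understand it: the Spark fork of hive-exec
     (an uber-jar); exact version may differ. -->
<dependency>
  <groupId>org.spark-project.hive</groupId>
  <artifactId>hive-exec</artifactId>
  <version>1.2.1.spark2</version>
</dependency>

<!-- What moving to standard Hive might look like; IIRC the "core"
     classifier is meant to be the non-uber-jar variant of hive-exec,
     but I may be misremembering. -->
<dependency>
  <groupId>org.apache.hive</groupId>
  <artifactId>hive-exec</artifactId>
  <version>1.2.1</version>
  <classifier>core</classifier>
</dependency>
```

If the "core" classifier does avoid the shaded-uber-jar problem, that would 
seem to answer whether changes in Hive are needed at all.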

> Remove references to org.spark-project.hive
> -------------------------------------------
>
>                 Key: SPARK-20202
>                 URL: https://issues.apache.org/jira/browse/SPARK-20202
>             Project: Spark
>          Issue Type: Bug
>          Components: Build, SQL
>    Affects Versions: 1.6.4, 2.0.3, 2.1.1
>            Reporter: Owen O'Malley
>            Priority: Critical
>
> Spark can't continue to depend on its fork of Hive and must move to 
> standard Hive versions.



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
