[ https://issues.apache.org/jira/browse/SPARK-20202?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16521203#comment-16521203 ]

Hyukjin Kwon commented on SPARK-20202:
--------------------------------------

Hi all, what do you guys think about replacing it with Hive 2.3.x in the near 
future (like Spark 3.0.0) given SPARK-23710, and keeping the fork for now?

It looks like [~q79969786] completed the initial attempt at SPARK-23710, and it 
now sounds quite feasible as an option, although some investigation still seems 
needed; however, I believe we can get through it if we have an explicit plan 
here. I think we are mostly all positive about this option as a final goal 
anyway, but I felt we needed to confirm it.
If getting rid of the fork completely can be set as the goal for this JIRA, 
\*I personally think\* the Hive side can also focus on landing other fixes in 
more recent versions without diverting effort to maintaining an old branch.

Until then, I think we could keep the fork for now and land some minor fixes if 
there are strong reasons for them. For example, Hadoop 3 support is blocked by 
a one-line fix in the fork. \*I personally think\* landing that fix into the 
fork is the easiest way, which seems pretty reasonable.

What do you guys think about this?



> Remove references to org.spark-project.hive
> -------------------------------------------
>
>                 Key: SPARK-20202
>                 URL: https://issues.apache.org/jira/browse/SPARK-20202
>             Project: Spark
>          Issue Type: Bug
>          Components: Build, SQL
>    Affects Versions: 1.6.4, 2.0.3, 2.1.1
>            Reporter: Owen O'Malley
>            Priority: Major
>
> Spark can't continue to depend on their fork of Hive and must move to 
> standard Hive versions.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
