[ https://issues.apache.org/jira/browse/SPARK-20202?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16540483#comment-16540483 ]

Hyukjin Kwon edited comment on SPARK-20202 at 7/11/18 6:24 PM:
---------------------------------------------------------------

[~rxin], there was already an initial try above, which at least made the 
regression tests we have written so far pass. I talked with [~q79969786] before, 
and she's willing to finish this. For that, I need more support from you and 
others to go this way.

On the other hand, I get your point too. So, do you think we should rather not 
explicitly target it, since it's pretty difficult, and instead let Hive 
publish 1.2.x first rather than keeping the fork, given it's unclear whether we 
can make it in 3.0.0?



> Remove references to org.spark-project.hive
> -------------------------------------------
>
>                 Key: SPARK-20202
>                 URL: https://issues.apache.org/jira/browse/SPARK-20202
>             Project: Spark
>          Issue Type: Bug
>          Components: Build, SQL
>    Affects Versions: 1.6.4, 2.0.3, 2.1.1
>            Reporter: Owen O'Malley
>            Priority: Major
>
> Spark can't continue to depend on its fork of Hive and must move to 
> standard Hive versions.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
