[ 
https://issues.apache.org/jira/browse/SPARK-20202?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15956879#comment-15956879
 ] 

Steve Loughran commented on SPARK-20202:
----------------------------------------

# the ugliness needed to insert the spark thrift stuff under the hive thrift 
stuff is obsolete and can be cut entirely.
# with the shading of kryo no longer needed, an unshaded Hive *may* work. I 
forget which trouble spots there were last time; probably the usual suspects: 
jackson, guava, etc.
# Hive 1.2.x refuses to work with Hadoop 3; it considers that an unsupported 
version. For basic client-side testing, you can build Hadoop 3 with a fake 
version (e.g. {{mvn install -DskipShade -Ddeclared.hadoop.version=2.11}}), but 
as the Hadoop version is something the NN/DNs care about, that's not really 
going to work on real systems. Presumably later Hive versions will address 
that.
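A minimal sketch of the fake-version build mentioned in point 3, assuming a Hadoop 3 source checkout; the {{-DskipTests}} flag is an addition here for speed, and the spoofed version only fools client-side version checks:

```shell
# Build and locally install Hadoop 3 artifacts that declare a fake 2.x version,
# so Hive 1.2.x's Hadoop version check passes during client-side testing.
# -DskipShade skips the shaded-client modules; the declared version is a lie,
# so NN/DNs on real clusters will still see (and reject) the mismatch.
mvn install -DskipTests -DskipShade -Ddeclared.hadoop.version=2.11
```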

If Hive takes over ownership of the 1.2.1-spark branch, this could be done 
simply by first pulling the spark branch into the Hive repo as a branch, 
defining the artifact naming properly, and releasing it. If that is done, 
before any release of that 1.2.x branch there are a couple of outstanding PRs 
to pull in (groovy version for security reasons, ...). A quick import & 
re-release would be the fast way to get this out as an ASF-approved binary.

> Remove references to org.spark-project.hive
> -------------------------------------------
>
>                 Key: SPARK-20202
>                 URL: https://issues.apache.org/jira/browse/SPARK-20202
>             Project: Spark
>          Issue Type: Bug
>          Components: Build, SQL
>    Affects Versions: 1.6.4, 2.0.3, 2.1.1
>            Reporter: Owen O'Malley
>            Priority: Blocker
>
> Spark can't continue to depend on their fork of Hive and must move to 
> standard Hive versions.



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)
