[
https://issues.apache.org/jira/browse/SPARK-20202?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15953298#comment-15953298
]
Owen O'Malley edited comment on SPARK-20202 at 4/3/17 11:16 AM:
----------------------------------------------------------------
Speaking as an Apache member: the Spark project can't release binary artifacts that
aren't built from its Apache code base. So either the Spark project needs to
use Hive's release artifacts, or it needs to formally fork Hive, move the
fork into its git repository at Apache, and rename it from org.apache.hive
to org.apache.spark. The current path is not allowed.
Hive is in the middle of rolling releases, so this is a good time to make
requests. The old uber jar (hive-exec) is already released separately with the
classifier "core." It looks like we are using the same protobuf (2.5.0) and
kryo (3.0.3) versions.
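For illustration, switching from the forked artifacts to Hive's released
hive-exec "core" jar would amount to a Maven dependency along these lines
(the version number here is only a placeholder, not one named in this issue):

```xml
<!-- Sketch: depend on Hive's official "core" classifier artifact
     instead of the org.spark-project.hive fork. -->
<dependency>
  <groupId>org.apache.hive</groupId>
  <artifactId>hive-exec</artifactId>
  <version>${hive.version}</version> <!-- placeholder; set to the Hive release being targeted -->
  <classifier>core</classifier>
</dependency>
```

The "core" classifier matters because the default hive-exec jar is an uber jar
that shades its dependencies, while the core variant leaves protobuf and kryo
as ordinary, resolvable dependencies.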
> Remove references to org.spark-project.hive
> -------------------------------------------
>
> Key: SPARK-20202
> URL: https://issues.apache.org/jira/browse/SPARK-20202
> Project: Spark
> Issue Type: Bug
> Components: Build, SQL
> Affects Versions: 1.6.4, 2.0.3, 2.1.1
> Reporter: Owen O'Malley
> Priority: Critical
>
> Spark can't continue to depend on its fork of Hive and must move to
> standard Hive versions.
--
This message was sent by Atlassian JIRA
(v6.3.15#6346)