[ https://issues.apache.org/jira/browse/SPARK-20202?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15957811#comment-15957811 ]
Reynold Xin commented on SPARK-20202:
-------------------------------------

Yes, this is really important. The proper way to do this is to publish a proper version of Hive with the right dependencies declared (rather than bundling all the dependencies in an uber jar). It looks like there is broad support for doing this. I'm going to create a JIRA ticket on Hive and add a dependency on it; this ticket will depend on that one.

> Remove references to org.spark-project.hive
> -------------------------------------------
>
>                 Key: SPARK-20202
>                 URL: https://issues.apache.org/jira/browse/SPARK-20202
>             Project: Spark
>          Issue Type: Bug
>          Components: Build, SQL
>    Affects Versions: 1.6.4, 2.0.3, 2.1.1
>            Reporter: Owen O'Malley
>
> Spark can't continue to depend on their fork of Hive and must move to
> standard Hive versions.

--
This message was sent by Atlassian JIRA
(v6.3.15#6346)
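As a rough illustration of the change being discussed, a Maven dependency on a standard Apache Hive release (with its own transitive dependencies declared) would replace the forked org.spark-project.hive uber jar. This is a hypothetical sketch only: the version number and the exclusion shown are illustrative, not taken from Spark's actual build files.

```xml
<!-- Hypothetical sketch: depend on a standard Apache Hive artifact that
     declares its own dependencies, instead of the org.spark-project.hive
     uber jar. Version and exclusions are illustrative assumptions. -->
<dependency>
  <groupId>org.apache.hive</groupId>
  <artifactId>hive-exec</artifactId>
  <version>1.2.1</version>
  <exclusions>
    <!-- Example: exclude a transitive dependency Spark manages itself -->
    <exclusion>
      <groupId>org.apache.calcite</groupId>
      <artifactId>calcite-core</artifactId>
    </exclusion>
  </exclusions>
</dependency>
```

Because the standard artifact declares its dependencies in its own POM, consumers like Spark can resolve, override, or exclude them individually, which is not possible with a shaded uber jar.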