[ https://issues.apache.org/jira/browse/SPARK-13446?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15369066#comment-15369066 ]
Andriy Kushnir commented on SPARK-13446:
----------------------------------------
Any plans for this issue?
> Spark need to support reading data from Hive 2.0.0 metastore
> ------------------------------------------------------------
>
> Key: SPARK-13446
> URL: https://issues.apache.org/jira/browse/SPARK-13446
> Project: Spark
> Issue Type: Improvement
> Components: SQL
> Affects Versions: 1.6.0
> Reporter: Lifeng Wang
>
> Spark provides a HiveContext class to read data from the Hive metastore
> directly. However, it only supports Hive 1.2.1 and older. Since Hive 2.0.0
> has been released, it would be better to upgrade Spark to support Hive 2.0.0;
> the stack trace and a minimal reproduction sketch are below.
> {noformat}
> 16/02/23 02:35:02 INFO metastore: Trying to connect to metastore with URI thrift://hsw-node13:9083
> 16/02/23 02:35:02 INFO metastore: Opened a connection to metastore, current connections: 1
> 16/02/23 02:35:02 INFO metastore: Connected to metastore.
> Exception in thread "main" java.lang.NoSuchFieldError: HIVE_STATS_JDBC_TIMEOUT
>         at org.apache.spark.sql.hive.HiveContext.configure(HiveContext.scala:473)
>         at org.apache.spark.sql.hive.HiveContext.metadataHive$lzycompute(HiveContext.scala:192)
>         at org.apache.spark.sql.hive.HiveContext.metadataHive(HiveContext.scala:185)
>         at org.apache.spark.sql.hive.HiveContext$$anon$1.<init>(HiveContext.scala:422)
>         at org.apache.spark.sql.hive.HiveContext.catalog$lzycompute(HiveContext.scala:422)
>         at org.apache.spark.sql.hive.HiveContext.catalog(HiveContext.scala:421)
>         at org.apache.spark.sql.hive.HiveContext.catalog(HiveContext.scala:72)
>         at org.apache.spark.sql.SQLContext.table(SQLContext.scala:739)
>         at org.apache.spark.sql.SQLContext.table(SQLContext.scala:735)
> {noformat}
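> For reference, a minimal sketch of the failing path is below. The table name
> is a placeholder, and the spark.sql.hive.metastore.* settings are shown only
> to illustrate where the metastore client version is chosen; as of Spark 1.6
> the built-in 1.2.1 client is the newest accepted value, which is the gap this
> issue asks to close. The metastore URI (thrift://hsw-node13:9083 above) comes
> from hive-site.xml on the classpath, not from this code.
> {noformat}
> import org.apache.spark.{SparkConf, SparkContext}
> import org.apache.spark.sql.hive.HiveContext
>
> object HiveMetastoreRepro {
>   def main(args: Array[String]): Unit = {
>     val conf = new SparkConf()
>       .setAppName("hive-metastore-repro")
>       // Where the metastore client version is selected; Spark 1.6 accepts
>       // nothing newer than the built-in 1.2.1 client, so a Hive 2.0.0
>       // metastore cannot be targeted here yet.
>       .set("spark.sql.hive.metastore.version", "1.2.1")
>       .set("spark.sql.hive.metastore.jars", "builtin")
>
>     val sc = new SparkContext(conf)
>     val hiveContext = new HiveContext(sc)
>
>     // Reading a table initializes the metastore client (metadataHive); the
>     // trace above shows the NoSuchFieldError thrown from
>     // HiveContext.configure during exactly this step.
>     hiveContext.table("some_table").show()   // "some_table" is a placeholder
>
>     sc.stop()
>   }
> }
> {noformat}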