[ https://issues.apache.org/jira/browse/SPARK-14492?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15955315#comment-15955315 ]
Sunil Rangwani edited comment on SPARK-14492 at 4/4/17 4:11 PM:
----------------------------------------------------------------
{quote}How would updating your external Hive metastore affect the code that Spark uses at runtime?{quote}
It can't. Spark uses the jars specified in the spark.sql.hive.metastore.jars property.
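
For reference, a minimal sketch of pointing Spark 1.6 at an older external metastore via these properties (the version number and jar path below are placeholders, not taken from this report):

{code}
// Minimal sketch, assuming a hypothetical Hive 0.14.0 metastore and a placeholder jar path.
// spark.sql.hive.metastore.jars must point at a classpath containing the Hive jars that match
// spark.sql.hive.metastore.version (plus their Hadoop dependencies).
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.hive.HiveContext

val conf = new SparkConf()
  .setAppName("external-metastore-example")
  .set("spark.sql.hive.metastore.version", "0.14.0")              // version of the external metastore
  .set("spark.sql.hive.metastore.jars", "/opt/hive-0.14.0/lib/*") // jars matching that version

val sc = new SparkContext(conf)
// Per the stack trace quoted below, constructing the HiveContext is where the
// java.lang.NoSuchFieldError surfaces on Spark 1.6.0.
val sqlContext = new HiveContext(sc)
{code}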
{quote}This isn't even consistent with the title of this JIRA.{quote}
I have changed the title so that it better reflects the actual problem.
was (Author: sunil.rangwani):
{quote}How would updating your external Hive metastore affect the code that Spark uses at runtime?{quote}
It can't. Spark uses the jars specified in the spark.sql.hive.metastore.jars property.
{quote}This isn't even consistent with the title of this JIRA.{quote}
I will change the title so that it better reflects the actual problem.
> Spark SQL 1.6.0 does not work with an external Hive metastore version lower than 1.2.0; it's not backwards compatible with earlier versions
> --------------------------------------------------------------------------------------------------------------------------------------
>
> Key: SPARK-14492
> URL: https://issues.apache.org/jira/browse/SPARK-14492
> Project: Spark
> Issue Type: Bug
> Components: SQL
> Affects Versions: 1.6.0
> Reporter: Sunil Rangwani
> Priority: Critical
>
> Spark SQL, when configured with a Hive version lower than 1.2.0, throws a
> java.lang.NoSuchFieldError for the field METASTORE_CLIENT_SOCKET_LIFETIME.
> This field was introduced in Hive 1.2.0, so it is not possible to use a Hive
> metastore version lower than 1.2.0 with Spark. The details of the Hive changes
> can be found here: https://issues.apache.org/jira/browse/HIVE-9508
> {code:java}
> Exception in thread "main" java.lang.NoSuchFieldError: METASTORE_CLIENT_SOCKET_LIFETIME
> 	at org.apache.spark.sql.hive.HiveContext.configure(HiveContext.scala:500)
> 	at org.apache.spark.sql.hive.HiveContext.metadataHive$lzycompute(HiveContext.scala:250)
> 	at org.apache.spark.sql.hive.HiveContext.metadataHive(HiveContext.scala:237)
> 	at org.apache.spark.sql.hive.HiveContext.setConf(HiveContext.scala:441)
> 	at org.apache.spark.sql.SQLContext$$anonfun$4.apply(SQLContext.scala:272)
> 	at org.apache.spark.sql.SQLContext$$anonfun$4.apply(SQLContext.scala:271)
> 	at scala.collection.Iterator$class.foreach(Iterator.scala:727)
> 	at scala.collection.AbstractIterator.foreach(Iterator.scala:1157)
> 	at scala.collection.IterableLike$class.foreach(IterableLike.scala:72)
> 	at scala.collection.AbstractIterable.foreach(Iterable.scala:54)
> 	at org.apache.spark.sql.SQLContext.<init>(SQLContext.scala:271)
> 	at org.apache.spark.sql.hive.HiveContext.<init>(HiveContext.scala:90)
> 	at org.apache.spark.sql.hive.HiveContext.<init>(HiveContext.scala:101)
> 	at org.apache.spark.sql.hive.thriftserver.SparkSQLEnv$.init(SparkSQLEnv.scala:58)
> 	at org.apache.spark.sql.hive.thriftserver.SparkSQLCLIDriver.<init>(SparkSQLCLIDriver.scala:267)
> 	at org.apache.spark.sql.hive.thriftserver.SparkSQLCLIDriver$.main(SparkSQLCLIDriver.scala:139)
> 	at org.apache.spark.sql.hive.thriftserver.SparkSQLCLIDriver.main(SparkSQLCLIDriver.scala)
> 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> 	at java.lang.reflect.Method.invoke(Method.java:498)
> 	at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:731)
> 	at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
> 	at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
> 	at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
> 	at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
> {code}
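
For illustration only (this snippet is not part of the original report): a minimal probe, assuming hive-common is available at compile and run time, showing why merely referencing this field fails against a pre-1.2.0 Hive jar:

{code}
// Hypothetical probe, not Spark code. ConfVars.METASTORE_CLIENT_SOCKET_LIFETIME was added
// in Hive 1.2.0 (HIVE-9508). Code compiled against hive-common >= 1.2.0 resolves the enum
// constant at runtime, so running with a pre-1.2.0 hive-common jar on the classpath
// throws java.lang.NoSuchFieldError, as in the stack trace above.
import org.apache.hadoop.hive.conf.HiveConf.ConfVars

object SocketLifetimeProbe {
  def main(args: Array[String]): Unit = {
    // Compiles against Hive 1.2.0+; fails at runtime with NoSuchFieldError on older jars.
    println(ConfVars.METASTORE_CLIENT_SOCKET_LIFETIME.varname)
  }
}
{code}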