[ https://issues.apache.org/jira/browse/SPARK-28910?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Yuming Wang resolved SPARK-28910.
---------------------------------
    Resolution: Fixed

> Prevent schema verification when connecting to in memory derby
> --------------------------------------------------------------
>
>                 Key: SPARK-28910
>                 URL: https://issues.apache.org/jira/browse/SPARK-28910
>             Project: Spark
>          Issue Type: Improvement
>          Components: SQL
>    Affects Versions: 2.4.3
>            Reporter: Juliusz Sompolski
>            Assignee: Bogdan Ghit
>            Priority: Major
>             Fix For: 3.0.0
>
> When {{hive.metastore.schema.verification=true}}, {{HiveUtils.newClientForExecution}} fails with
> {code}
> 19/08/14 13:26:55 WARN Hive: Failed to access metastore. This class should not accessed in runtime.
> org.apache.hadoop.hive.ql.metadata.HiveException: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient
> 	at org.apache.hadoop.hive.ql.metadata.Hive.getAllDatabases(Hive.java:1236)
> 	at org.apache.hadoop.hive.ql.metadata.Hive.reloadFunctions(Hive.java:174)
> 	at org.apache.hadoop.hive.ql.metadata.Hive.<clinit>(Hive.java:166)
> 	at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:503)
> 	at org.apache.spark.sql.hive.client.HiveClientImpl.newState(HiveClientImpl.scala:186)
> 	at org.apache.spark.sql.hive.client.HiveClientImpl.<init>(HiveClientImpl.scala:143)
> 	at org.apache.spark.sql.hive.client.IsolatedClientLoader.createClient(IsolatedClientLoader.scala:290)
> 	at org.apache.spark.sql.hive.HiveUtils$.newClientForExecution(HiveUtils.scala:275)
> 	at org.apache.spark.sql.hive.thriftserver.HiveThriftServer2$.startWithContext(HiveThriftServer2.scala:58)
> 	...
> Caused by: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient
> {code}
> This prevents the Thriftserver from starting.

--
This message was sent by Atlassian Jira
(v8.3.2#803003)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org
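[Editor's note] For users on the affected versions who cannot upgrade to 3.0.0, a possible workaround sketch (not part of this fix, and not verified against it) is to disable Hive metastore schema verification explicitly. Spark forwards any {{spark.hadoop.*}} property into the Hadoop/Hive configuration, so the relevant Hive setting can be supplied in {{spark-defaults.conf}}:

{code}
# Workaround sketch (assumption: this setting reaches the execution Hive client
# via Spark's spark.hadoop.* configuration passthrough):
spark.hadoop.hive.metastore.schema.verification  false
{code}

The same property can be passed on the command line via {{--conf spark.hadoop.hive.metastore.schema.verification=false}}.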