wanghangyu817 opened a new issue #1959:
URL: https://github.com/apache/incubator-kyuubi/issues/1959


   ### Code of Conduct
   
   - [X] I agree to follow this project's [Code of 
Conduct](https://www.apache.org/foundation/policies/conduct)
   
   
   ### Search before asking
   
   - [X] I have searched in the 
[issues](https://github.com/apache/incubator-kyuubi/issues?q=is%3Aissue) and 
found no similar issues.
   
   
   ### Describe the bug
   
   Running a `GET_TABLES` operation (e.g. listing tables from a JDBC client) fails when a database contains HBase-backed Hive tables. The operation goes `RUNNING_STATE -> ERROR_STATE` with `java.lang.ClassNotFoundException: org.apache.hadoop.hive.hbase.HBaseStorageHandler`: the Spark engine cannot load the HBase storage handler class while converting the Hive table metadata, apparently because the `hive-hbase-handler` jar is not on the engine's classpath.
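
   A possible workaround sketch, assuming the root cause is the missing `hive-hbase-handler` jar on the Spark engine classpath (the path and version below are placeholders, not verified against this deployment):

   ```properties
   # kyuubi-defaults.conf (sketch): spark.* entries are passed through to the
   # Spark engine, so the HBase storage handler jar becomes loadable when
   # Table.getStorageHandler resolves org.apache.hadoop.hive.hbase.HBaseStorageHandler.
   # Replace the path/version with the jar shipped with your Hive installation.
   spark.jars=/opt/hive/lib/hive-hbase-handler-2.3.9.jar
   ```

   Placing the jar under the engine's `$SPARK_HOME/jars` may work as well. This only sidesteps the error; whether Kyuubi's table listing should instead tolerate unloadable storage handlers is a separate question.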
   
   ### Affects Version(s)
   
   1.4.1
   
   ### Kyuubi Server Log Output
   
   ```log
   22/02/22 17:18:03 INFO operation.GetTables: Processing anonymous's 
query[5f817196-238c-4a71-b8c5-4bb432d1cb3f]: INITIALIZED_STATE -> 
RUNNING_STATE, statement: GET_TABLES
   22/02/22 17:18:04 INFO operation.GetTables: Processing anonymous's 
query[5f817196-238c-4a71-b8c5-4bb432d1cb3f]: RUNNING_STATE -> ERROR_STATE, 
statement: GET_TABLES, time taken: 0.902 seconds
   22/02/22 17:18:04 INFO operation.GetTables: Processing anonymous's 
query[5f817196-238c-4a71-b8c5-4bb432d1cb3f]: ERROR_STATE -> CLOSED_STATE, 
statement: GET_TABLES
   22/02/22 17:18:04 ERROR server.KyuubiThriftBinaryFrontendService: Error 
getting tables: 
   org.apache.kyuubi.KyuubiSQLException: Error operating GET_TABLES: 
java.lang.RuntimeException: org.apache.hadoop.hive.ql.metadata.HiveException: 
Error in loading storage 
handler.org.apache.hadoop.hive.hbase.HBaseStorageHandler
           at 
org.apache.hadoop.hive.ql.metadata.Table.getStorageHandler(Table.java:297)
           at 
org.apache.spark.sql.hive.client.HiveClientImpl.convertHiveTableToCatalogTable(HiveClientImpl.scala:648)
           at 
org.apache.spark.sql.hive.client.HiveClientImpl.$anonfun$getTablesByName$2(HiveClientImpl.scala:414)
           at 
scala.collection.TraversableLike.$anonfun$map$1(TraversableLike.scala:238)
           at 
scala.collection.mutable.ResizableArray.foreach(ResizableArray.scala:62)
           at 
scala.collection.mutable.ResizableArray.foreach$(ResizableArray.scala:55)
           at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:49)
           at scala.collection.TraversableLike.map(TraversableLike.scala:238)
           at scala.collection.TraversableLike.map$(TraversableLike.scala:231)
           at scala.collection.AbstractTraversable.map(Traversable.scala:108)
           at 
org.apache.spark.sql.hive.client.HiveClientImpl.$anonfun$getTablesByName$1(HiveClientImpl.scala:414)
           at 
org.apache.spark.sql.hive.client.HiveClientImpl.$anonfun$withHiveState$1(HiveClientImpl.scala:293)
           at 
org.apache.spark.sql.hive.client.HiveClientImpl.liftedTree1$1(HiveClientImpl.scala:226)
           at 
org.apache.spark.sql.hive.client.HiveClientImpl.retryLocked(HiveClientImpl.scala:225)
           at 
org.apache.spark.sql.hive.client.HiveClientImpl.withHiveState(HiveClientImpl.scala:275)
           at 
org.apache.spark.sql.hive.client.HiveClientImpl.getTablesByName(HiveClientImpl.scala:414)
           at 
org.apache.spark.sql.hive.HiveExternalCatalog.getRawTablesByNames(HiveExternalCatalog.scala:127)
           at 
org.apache.spark.sql.hive.HiveExternalCatalog.$anonfun$getTablesByName$1(HiveExternalCatalog.scala:726)
           at 
org.apache.spark.sql.hive.HiveExternalCatalog.withClient(HiveExternalCatalog.scala:102)
           at 
org.apache.spark.sql.hive.HiveExternalCatalog.getTablesByName(HiveExternalCatalog.scala:726)
           at 
org.apache.spark.sql.catalyst.catalog.ExternalCatalogWithListener.getTablesByName(ExternalCatalogWithListener.scala:142)
           at 
org.apache.spark.sql.catalyst.catalog.SessionCatalog.getTablesByName(SessionCatalog.scala:514)
           at 
org.apache.kyuubi.engine.spark.shim.CatalogShim_v2_4.$anonfun$getCatalogTablesOrViews$1(CatalogShim_v2_4.scala:59)
           at 
scala.collection.TraversableLike.$anonfun$flatMap$1(TraversableLike.scala:245)
           at scala.collection.Iterator.foreach(Iterator.scala:941)
           at scala.collection.Iterator.foreach$(Iterator.scala:941)
           at scala.collection.AbstractIterator.foreach(Iterator.scala:1429)
           at scala.collection.IterableLike.foreach(IterableLike.scala:74)
           at scala.collection.IterableLike.foreach$(IterableLike.scala:73)
           at scala.collection.AbstractIterable.foreach(Iterable.scala:56)
           at 
scala.collection.TraversableLike.flatMap(TraversableLike.scala:245)
           at 
scala.collection.TraversableLike.flatMap$(TraversableLike.scala:242)
           at 
scala.collection.AbstractTraversable.flatMap(Traversable.scala:108)
           at 
org.apache.kyuubi.engine.spark.shim.CatalogShim_v2_4.getCatalogTablesOrViews(CatalogShim_v2_4.scala:57)
           at 
org.apache.kyuubi.engine.spark.shim.CatalogShim_v3_0.getCatalogTablesOrViews(CatalogShim_v3_0.scala:140)
           at 
org.apache.kyuubi.engine.spark.operation.GetTables.runInternal(GetTables.scala:74)
           at 
org.apache.kyuubi.operation.AbstractOperation.run(AbstractOperation.scala:130)
           at 
org.apache.kyuubi.session.AbstractSession.runOperation(AbstractSession.scala:93)
           at 
org.apache.kyuubi.engine.spark.session.SparkSessionImpl.runOperation(SparkSessionImpl.scala:45)
           at 
org.apache.kyuubi.session.AbstractSession.getTables(AbstractSession.scala:152)
           at 
org.apache.kyuubi.service.AbstractBackendService.getTables(AbstractBackendService.scala:89)
           at 
org.apache.kyuubi.service.ThriftBinaryFrontendService.GetTables(ThriftBinaryFrontendService.scala:327)
           at 
org.apache.hive.service.rpc.thrift.TCLIService$Processor$GetTables.getResult(TCLIService.java:1637)
           at 
org.apache.hive.service.rpc.thrift.TCLIService$Processor$GetTables.getResult(TCLIService.java:1622)
           at org.apache.thrift.ProcessFunction.process(ProcessFunction.java:38)
           at org.apache.thrift.TBaseProcessor.process(TBaseProcessor.java:39)
           at 
org.apache.kyuubi.service.authentication.TSetIpAddressProcessor.process(TSetIpAddressProcessor.scala:36)
           at 
org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:310)
           at 
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
           at 
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
           at java.lang.Thread.run(Thread.java:748)
   Caused by: org.apache.hadoop.hive.ql.metadata.HiveException: Error in 
loading storage handler.org.apache.hadoop.hive.hbase.HBaseStorageHandler
           at 
org.apache.hadoop.hive.ql.metadata.HiveUtils.getStorageHandler(HiveUtils.java:310)
           at 
org.apache.hadoop.hive.ql.metadata.Table.getStorageHandler(Table.java:292)
           ... 50 more
   Caused by: java.lang.ClassNotFoundException: 
org.apache.hadoop.hive.hbase.HBaseStorageHandler
           at java.net.URLClassLoader.findClass(URLClassLoader.java:382)
           at java.lang.ClassLoader.loadClass(ClassLoader.java:418)
           at java.lang.ClassLoader.loadClass(ClassLoader.java:351)
           at java.lang.Class.forName0(Native Method)
           at java.lang.Class.forName(Class.java:348)
           at 
org.apache.hadoop.hive.ql.metadata.HiveUtils.getStorageHandler(HiveUtils.java:305)
           ... 51 more
   
           at 
org.apache.kyuubi.KyuubiSQLException$.apply(KyuubiSQLException.scala:69)
           at 
org.apache.kyuubi.engine.spark.operation.SparkOperation$$anonfun$onError$1.applyOrElse(SparkOperation.scala:102)
           at 
org.apache.kyuubi.engine.spark.operation.SparkOperation$$anonfun$onError$1.applyOrElse(SparkOperation.scala:85)
           at 
scala.runtime.AbstractPartialFunction.apply(AbstractPartialFunction.scala:38)
           at 
org.apache.kyuubi.engine.spark.operation.GetTables.runInternal(GetTables.scala:85)
           at 
org.apache.kyuubi.operation.AbstractOperation.run(AbstractOperation.scala:130)
           at 
org.apache.kyuubi.session.AbstractSession.runOperation(AbstractSession.scala:93)
           at 
org.apache.kyuubi.engine.spark.session.SparkSessionImpl.runOperation(SparkSessionImpl.scala:45)
           at 
org.apache.kyuubi.session.AbstractSession.getTables(AbstractSession.scala:152)
           at 
org.apache.kyuubi.service.AbstractBackendService.getTables(AbstractBackendService.scala:89)
           at 
org.apache.kyuubi.service.ThriftBinaryFrontendService.GetTables(ThriftBinaryFrontendService.scala:327)
           at 
org.apache.hive.service.rpc.thrift.TCLIService$Processor$GetTables.getResult(TCLIService.java:1637)
           at 
org.apache.hive.service.rpc.thrift.TCLIService$Processor$GetTables.getResult(TCLIService.java:1622)
           at org.apache.thrift.ProcessFunction.process(ProcessFunction.java:38)
           at org.apache.thrift.TBaseProcessor.process(TBaseProcessor.java:39)
           at 
org.apache.kyuubi.service.authentication.TSetIpAddressProcessor.process(TSetIpAddressProcessor.scala:36)
           at 
org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:310)
           at 
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
           at 
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
           at java.lang.Thread.run(Thread.java:748)
   ```
   
   
   ### Kyuubi Engine Log Output
   
   _No response_
   
   ### Kyuubi Server Configurations
   
   _No response_
   
   
   ### Kyuubi Engine Configurations
   
   _No response_
   
   ### Additional context
   
   _No response_
   
   ### Are you willing to submit PR?
   
   - [X] Yes I am willing to submit a PR!

