MLikeWater commented on issue #2855:
URL: https://github.com/apache/incubator-kyuubi/issues/2855#issuecomment-1152817486

   I tried the following:
   ```
   # upload jars
   cd apache-hive-metastore-3.1.2-bin/lib
   ozone fs -put *  ofs://cluster1/hivemetastore/jars/
   
   # add the following configs in kyuubi-defaults.conf
   spark.sql.hive.metastore.version 3.1.2
   spark.sql.hive.metastore.jars path
   spark.sql.hive.metastore.jars.path ofs://cluster1/hivemetastore/*
   
   # restart kyuubi server pod
   # access kyuubi
   kubectl exec --stdin --tty kyuubi-server-0   -n spark-operator -- /bin/bash
    ./bin/beeline -u jdbc:hive2://10.2.1.5:10011 -n meimei -pxxxx
   ```
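   As a sanity check (not part of the original steps), the uploaded files can be listed to confirm what the glob in `spark.sql.hive.metastore.jars.path` will actually resolve to:
   ```
   # sanity check: list what actually landed in Ozone;
   # the jars.path glob must expand to jar files, not directories
   ozone fs -ls ofs://cluster1/hivemetastore/
   ozone fs -ls ofs://cluster1/hivemetastore/jars/
   ```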
   But the engine failed to start with the following errors:
   ```
   22/06/11 08:35:31 DEBUG ProtobufRpcEngine: Call: submitRequest took 1ms
   22/06/11 08:35:31 DEBUG IsolatedClientLoader: hive class: org.apache.hadoop.hive.conf.HiveConf - null
   22/06/11 08:35:31 ERROR SparkSQLEngine: Failed to instantiate SparkSession: java.lang.NoClassDefFoundError: org/apache/hadoop/hive/conf/HiveConf when creating Hive client using classpath: ofs://cluster1/hivemetastore/jars, ofs://cluster1/hivemetastore/php Please make sure that jars for your version of hive and hadoop are included in the paths passed to spark.sql.hive.metastore.jars.
   java.lang.ClassNotFoundException: java.lang.NoClassDefFoundError: org/apache/hadoop/hive/conf/HiveConf when creating Hive client using classpath: ofs://cluster1/hivemetastore/jars, ofs://cluster1/hivemetastore/php Please make sure that jars for your version of hive and hadoop are included in the paths passed to spark.sql.hive.metastore.jars.
        at org.apache.spark.sql.errors.QueryExecutionErrors$.loadHiveClientCausesNoClassDefFoundError(QueryExecutionErrors.scala:1338)
        at org.apache.spark.sql.hive.client.IsolatedClientLoader.createClient(IsolatedClientLoader.scala:317)
        at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:496)
        at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:356)
        at org.apache.spark.sql.hive.HiveExternalCatalog.client$lzycompute(HiveExternalCatalog.scala:71)
        at org.apache.spark.sql.hive.HiveExternalCatalog.client(HiveExternalCatalog.scala:70)
        at org.apache.spark.sql.hive.HiveExternalCatalog.$anonfun$databaseExists$1(HiveExternalCatalog.scala:224)
        at scala.runtime.java8.JFunction0$mcZ$sp.apply(JFunction0$mcZ$sp.java:23)
        at org.apache.spark.sql.hive.HiveExternalCatalog.withClient(HiveExternalCatalog.scala:102)
        at org.apache.spark.sql.hive.HiveExternalCatalog.databaseExists(HiveExternalCatalog.scala:224)
        at org.apache.spark.sql.internal.SharedState.externalCatalog$lzycompute(SharedState.scala:150)
        at org.apache.spark.sql.internal.SharedState.externalCatalog(SharedState.scala:140)
        at org.apache.spark.sql.hive.HiveSessionStateBuilder.externalCatalog(HiveSessionStateBuilder.scala:45)
        at org.apache.spark.sql.hive.HiveSessionStateBuilder.$anonfun$catalog$1(HiveSessionStateBuilder.scala:60)
        at org.apache.spark.sql.catalyst.catalog.SessionCatalog.externalCatalog$lzycompute(SessionCatalog.scala:118)
        at org.apache.spark.sql.catalyst.catalog.SessionCatalog.externalCatalog(SessionCatalog.scala:118)
        at org.apache.spark.sql.catalyst.catalog.SessionCatalog.listDatabases(SessionCatalog.scala:298)
        at org.apache.spark.sql.execution.datasources.v2.V2SessionCatalog.listNamespaces(V2SessionCatalog.scala:205)
        at org.apache.spark.sql.execution.datasources.v2.ShowNamespacesExec.run(ShowNamespacesExec.scala:42)
        at org.apache.spark.sql.execution.datasources.v2.V2CommandExec.result$lzycompute(V2CommandExec.scala:43)
        at org.apache.spark.sql.execution.datasources.v2.V2CommandExec.result(V2CommandExec.scala:43)
        at org.apache.spark.sql.execution.datasources.v2.V2CommandExec.executeCollect(V2CommandExec.scala:49)
        at org.apache.kyuubi.plugin.spark.authz.ranger.FilteredShowObjectsExec.org$apache$kyuubi$plugin$spark$authz$ranger$FilteredShowObjectsExec$$result(FilteredShowObjectsExec.scala:34)
        at org.apache.kyuubi.plugin.spark.authz.ranger.FilteredShowObjectsExec.org$apache$kyuubi$plugin$spark$authz$ranger$FilteredShowObjectsExec$$result$(FilteredShowObjectsExec.scala:33)
        at org.apache.kyuubi.plugin.spark.authz.ranger.FilteredShowNamespaceExec.org$apache$kyuubi$plugin$spark$authz$ranger$FilteredShowObjectsExec$$result$lzycompute(FilteredShowObjectsExec.scala:44)
        at org.apache.kyuubi.plugin.spark.authz.ranger.FilteredShowNamespaceExec.org$apache$kyuubi$plugin$spark$authz$ranger$FilteredShowObjectsExec$$result(FilteredShowObjectsExec.scala:44)
        at org.apache.kyuubi.plugin.spark.authz.ranger.FilteredShowObjectsExec.doExecute(FilteredShowObjectsExec.scala:38)
        at org.apache.kyuubi.plugin.spark.authz.ranger.FilteredShowObjectsExec.doExecute$(FilteredShowObjectsExec.scala:37)
        at org.apache.kyuubi.plugin.spark.authz.ranger.FilteredShowNamespaceExec.doExecute(FilteredShowObjectsExec.scala:44)
        at org.apache.spark.sql.execution.SparkPlan.$anonfun$execute$1(SparkPlan.scala:184)
        at org.apache.spark.sql.execution.SparkPlan.$anonfun$executeQuery$1(SparkPlan.scala:222)
        at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
        at org.apache.spark.sql.execution.SparkPlan.executeQuery(SparkPlan.scala:219)
        at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:180)
        at org.apache.spark.sql.execution.InputAdapter.inputRDD(WholeStageCodegenExec.scala:526)
        at org.apache.spark.sql.execution.InputRDDCodegen.inputRDDs(WholeStageCodegenExec.scala:454)
        at org.apache.spark.sql.execution.InputRDDCodegen.inputRDDs$(WholeStageCodegenExec.scala:453)
        at org.apache.spark.sql.execution.InputAdapter.inputRDDs(WholeStageCodegenExec.scala:497)
        at org.apache.spark.sql.execution.ProjectExec.inputRDDs(basicPhysicalOperators.scala:50)
        at org.apache.spark.sql.execution.WholeStageCodegenExec.doExecute(WholeStageCodegenExec.scala:750)
        at org.apache.spark.sql.execution.SparkPlan.$anonfun$execute$1(SparkPlan.scala:184)
        at org.apache.spark.sql.execution.SparkPlan.$anonfun$executeQuery$1(SparkPlan.scala:222)
        at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
        at org.apache.spark.sql.execution.SparkPlan.executeQuery(SparkPlan.scala:219)
        at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:180)
        at org.apache.spark.sql.execution.SparkPlan.getByteArrayRdd(SparkPlan.scala:325)
        at org.apache.spark.sql.execution.SparkPlan.executeTake(SparkPlan.scala:443)
        at org.apache.spark.sql.execution.SparkPlan.executeTake(SparkPlan.scala:429)
        at org.apache.spark.sql.Dataset.$anonfun$isEmpty$1(Dataset.scala:604)
        at org.apache.spark.sql.Dataset.$anonfun$isEmpty$1$adapted(Dataset.scala:603)
        at org.apache.spark.sql.Dataset.$anonfun$withAction$1(Dataset.scala:3706)
        at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withNewExecutionId$5(SQLExecution.scala:103)
        at org.apache.spark.sql.execution.SQLExecution$.withSQLConfPropagated(SQLExecution.scala:163)
        at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withNewExecutionId$1(SQLExecution.scala:90)
        at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:775)
        at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:64)
        at org.apache.spark.sql.Dataset.withAction(Dataset.scala:3704)
        at org.apache.spark.sql.Dataset.isEmpty(Dataset.scala:603)
        at org.apache.kyuubi.engine.spark.SparkSQLEngine$.$anonfun$createSpark$1(SparkSQLEngine.scala:193)
        at org.apache.kyuubi.engine.spark.SparkSQLEngine$.$anonfun$createSpark$1$adapted(SparkSQLEngine.scala:187)
        at scala.collection.mutable.ResizableArray.foreach(ResizableArray.scala:62)
        at scala.collection.mutable.ResizableArray.foreach$(ResizableArray.scala:55)
        at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:49)
        at org.apache.kyuubi.engine.spark.SparkSQLEngine$.createSpark(SparkSQLEngine.scala:187)
        at org.apache.kyuubi.engine.spark.SparkSQLEngine$.main(SparkSQLEngine.scala:278)
        at org.apache.kyuubi.engine.spark.SparkSQLEngine.main(SparkSQLEngine.scala)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
        at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:955)
        at org.apache.spark.deploy.SparkSubmit$$anon$1.run(SparkSubmit.scala:165)
        at org.apache.spark.deploy.SparkSubmit$$anon$1.run(SparkSubmit.scala:163)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:422)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1762)
        at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:163)
        at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:203)
        at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:90)
        at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:1043)
        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:1052)
        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
   Caused by: java.lang.reflect.InvocationTargetException
        at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
        at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
        at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
        at org.apache.spark.sql.hive.client.IsolatedClientLoader.createClient(IsolatedClientLoader.scala:310)
        ... 81 more
   Caused by: java.lang.NoClassDefFoundError: org/apache/hadoop/hive/conf/HiveConf
        at org.apache.spark.sql.hive.client.HiveClientImpl$.newHiveConf(HiveClientImpl.scala:1245)
        at org.apache.spark.sql.hive.client.HiveClientImpl.newState(HiveClientImpl.scala:164)
        at org.apache.spark.sql.hive.client.HiveClientImpl.<init>(HiveClientImpl.scala:129)
        ... 86 more
   Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.hive.conf.HiveConf
        at java.net.URLClassLoader.findClass(URLClassLoader.java:387)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:418)
        at org.apache.spark.sql.hive.client.IsolatedClientLoader$$anon$1.doLoadClass(IsolatedClientLoader.scala:264)
        at org.apache.spark.sql.hive.client.IsolatedClientLoader$$anon$1.loadClass(IsolatedClientLoader.scala:253)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:351)
        ... 89 more
   ```
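   Looking at the classpath reported in the error, the glob `ofs://cluster1/hivemetastore/*` seems to have expanded to the directories `ofs://cluster1/hivemetastore/jars` and `ofs://cluster1/hivemetastore/php` rather than to the jar files themselves, which would explain why `HiveConf` cannot be found. Assuming the jars really live under `hivemetastore/jars/` as in the upload step above, a sketch of a corrected config could look like this (the `*.jar` suffix is my assumption, to skip non-jar entries such as the `php` directory that appears to have been copied along from Hive's lib):
   ```
   # hypothetical fix: make the glob match the jar files, not their parent directory
   spark.sql.hive.metastore.version 3.1.2
   spark.sql.hive.metastore.jars path
   spark.sql.hive.metastore.jars.path ofs://cluster1/hivemetastore/jars/*.jar
   ```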

