MLikeWater commented on issue #2855:
URL: https://github.com/apache/incubator-kyuubi/issues/2855#issuecomment-1152848166

   > I tried the following:
   > 
   > ```
   > # upload jars (hive, hadoop)
   > ozone fs -put * ofs://cluster1/hivemetastore/jars/
   > 
   > # add config in kyuubi-defaults.conf 
   > spark.sql.hive.metastore.version 3.1.2
   > spark.sql.hive.metastore.jars path
   > spark.sql.hive.metastore.jars.path ofs://cluster1/hivemetastore/jars/*
   > 
   > # restart kyuubi server pod
   > # access kyuubi
   > kubectl exec --stdin --tty kyuubi-server-0 -n spark-operator -- /bin/bash
   > ./bin/beeline -u jdbc:hive2://10.2.1.5:10011 -n meimei -pxxxx
   > ```
   > 
   > But the engine failed to start with the following errors:
   > 
   > ```
   > 22/06/11 11:41:33 DEBUG MutableMetricsFactory: field private org.apache.hadoop.metrics2.lib.MutableCounterLong org.apache.hadoop.hdds.scm.XceiverClientMetrics.pendingOps with annotation @org.apache.hadoop.metrics2.annotation.Metric(sampleName=Ops, always=false, valueName=Time, about=, interval=10, type=DEFAULT, value=[])
   > 22/06/11 11:41:33 DEBUG MutableMetricsFactory: field private org.apache.hadoop.metrics2.lib.MutableCounterLong org.apache.hadoop.hdds.scm.XceiverClientMetrics.totalOps with annotation @org.apache.hadoop.metrics2.annotation.Metric(sampleName=Ops, always=false, valueName=Time, about=, interval=10, type=DEFAULT, value=[])
   > 22/06/11 11:41:33 DEBUG IsolatedClientLoader: hive class: org.apache.hadoop.hive.conf.HiveConf - null
   > 22/06/11 11:41:33 ERROR SparkSQLEngine: Failed to instantiate SparkSession:  java.lang.NoClassDefFoundError: org/apache/hadoop/hive/conf/HiveConf when creating Hive client using classpath: ofs://cluster1/hivemetastore/jars/......
   > ...
   >  Please make sure that jars for your version of hive and hadoop are included in the paths passed to spark.sql.hive.metastore.jars.
   > ...
   > Caused by: java.lang.reflect.InvocationTargetException
   >         at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
   >         at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
   >         at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
   >         at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
   >         at org.apache.spark.sql.hive.client.IsolatedClientLoader.createClient(IsolatedClientLoader.scala:310)
   >         ... 81 more
   > Caused by: java.lang.NoClassDefFoundError: org/apache/hadoop/hive/conf/HiveConf
   >         at org.apache.spark.sql.hive.client.HiveClientImpl$.newHiveConf(HiveClientImpl.scala:1245)
   >         at org.apache.spark.sql.hive.client.HiveClientImpl.newState(HiveClientImpl.scala:164)
   >         at org.apache.spark.sql.hive.client.HiveClientImpl.<init>(HiveClientImpl.scala:129)
   >         ... 86 more
   > Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.hive.conf.HiveConf
   >         at java.net.URLClassLoader.findClass(URLClassLoader.java:387)
   >         at java.lang.ClassLoader.loadClass(ClassLoader.java:418)
   >         at org.apache.spark.sql.hive.client.IsolatedClientLoader$$anon$1.doLoadClass(IsolatedClientLoader.scala:264)
   >         at org.apache.spark.sql.hive.client.IsolatedClientLoader$$anon$1.loadClass(IsolatedClientLoader.scala:253)
   >         at java.lang.ClassLoader.loadClass(ClassLoader.java:351)
   >         ... 89 more
   > 22/06/11 11:41:33 INFO SparkUI: Stopped Spark web UI at http://xxxx:44307
   > 22/06/11 11:41:33 INFO KubernetesClusterSchedulerBackend: Shutting down all executors
   > 22/06/11 11:41:33 INFO KubernetesClusterSchedulerBackend$KubernetesDriverEndpoint: Asking each executor to shut down
   > 22/06/11 11:41:33 DEBUG AbstractWatchManager: Force closing the watch io.fabric8.kubernetes.client.dsl.internal.WatchConnectionManager@197c3101
   > 22/06/11 11:41:33 WARN ExecutorPodsWatchSnapshotSource: Kubernetes client has been closed.
   > 22/06/11 11:41:33 DEBUG AbstractWatchManager: Closing websocket okhttp3.internal.ws.RealWebSocket@4d65fbad
   > 22/06/11 11:41:33 DEBUG AbstractWatchManager: Closing ExecutorService
   > 22/06/11 11:41:33 DEBUG WatcherWebSocketListener: Socket closing:
   > 22/06/11 11:41:33 DEBUG WatcherWebSocketListener: WebSocket close received. code: 1000, reason:
   > 22/06/11 11:41:33 DEBUG WatcherWebSocketListener: Ignoring onClose for already closed/closing websocket
   > 22/06/11 11:41:33 DEBUG FsUrlStreamHandlerFactory: Creating handler for protocol https
   > 22/06/11 11:41:33 DEBUG FsUrlStreamHandlerFactory: Unknown protocol https, delegating to default implementation
   > 22/06/11 11:41:34 INFO MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
   > 22/06/11 11:41:34 INFO MemoryStore: MemoryStore cleared
   > 22/06/11 11:41:34 INFO BlockManager: BlockManager stopped
   > 22/06/11 11:41:34 INFO BlockManagerMaster: BlockManagerMaster stopped
   > 22/06/11 11:41:34 INFO OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
   > 22/06/11 11:41:34 INFO SparkContext: Successfully stopped SparkContext
   > 22/06/11 11:41:36 DEBUG PoolThreadCache: Freed 4 thread-local buffer(s) from thread: rpc-server-4-1
   > ```
   
   I also tried putting every jar from the Spark `jars` directory into ofs://cluster1/hivemetastore/jars/, but I still get `Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.hive.conf.HiveConf`. It seems that `spark.sql.hive.metastore.jars` and `spark.sql.hive.metastore.jars.path` do not take effect.
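
   One thing worth ruling out before blaming the config: whether the uploaded jars actually contain the `HiveConf` class at all. Below is a hypothetical local sanity check (the function name `find_hiveconf_jars` and the staging path are made up for illustration; it assumes the jars were staged in a local directory before the `ozone fs -put`). It lists jar entries with Python's `zipfile` module so it does not depend on `unzip` being installed:

   ```shell
   # Hypothetical helper: print every jar under the given directory that
   # contains the org.apache.hadoop.hive.conf.HiveConf class.
   find_hiveconf_jars() {
     local dir="$1" j
     for j in "$dir"/*.jar; do
       [ -e "$j" ] || continue
       # list the jar's entries via Python's zipfile module (no unzip needed)
       if python3 -m zipfile -l "$j" 2>/dev/null \
           | grep -q 'org/apache/hadoop/hive/conf/HiveConf'; then
         echo "$j"
       fi
     done
   }

   # Example (path is illustrative, not from the issue):
   # find_hiveconf_jars /tmp/hivemetastore-jars
   ```

   If no jar is printed, the `hive-common` jar (which provides `HiveConf`) was likely never uploaded, and the `spark.sql.hive.metastore.jars.path` setting cannot help regardless of whether it takes effect.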


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]

