jinkachy opened a new issue, #5317:
URL: https://github.com/apache/kyuubi/issues/5317

   ### Code of Conduct
   
   - [X] I agree to follow this project's [Code of Conduct](https://www.apache.org/foundation/policies/conduct)
   
   
   ### Search before asking
   
   - [X] I have searched in the [issues](https://github.com/apache/kyuubi/issues?q=is%3Aissue) and found no similar issues.
   
   
   ### Describe the bug
   
   Data cannot be read from an Avro-format Hive table through either kyuubi-spark-connector-hive_2.12-1.8.0.1.jar or kyuubi-spark-connector-hive_2.12-1.8.0.4.jar.
   
   The reproducing code is:
   ```scala
   import org.apache.spark.SparkConf
   import org.apache.spark.sql.SparkSession

   val catalog3 = "oppo"
   val oppoConfigs = Map[String, String](
     "spark.sql.catalogImplementation" -> "hive",
     s"spark.sql.catalog.$catalog3" -> "org.apache.kyuubi.spark.connector.hive.HiveTableCatalog",
     s"spark.sql.catalog.$catalog3.hive.metastore.uris" -> "thrift://*******:9083",
     s"spark.sql.catalog.$catalog3.hive.metastore.sasl.enabled" -> "false",
     s"spark.sql.catalog.$catalog3.hive.exec.dynamic.partition.mode" -> "nonstrict",
     s"spark.sql.catalog.$catalog3.dfs.nameservices" -> "nameservice1",
     s"spark.sql.catalog.$catalog3.dfs.ha.namenodes.nameservice1" -> "namenode63,namenode49",
     s"spark.sql.catalog.$catalog3.dfs.namenode.rpc-address.nameservice1.namenode63" -> "*******:8020",
     s"spark.sql.catalog.$catalog3.dfs.namenode.rpc-address.nameservice1.namenode49" -> "*****8020",
     // s"spark.sql.catalog.$catalog3.hadoop.security.authentication" -> "simple",
     // s"spark.sql.catalog.$catalog3.hadoop.security.authorization" -> "false",
     s"spark.sql.catalog.$catalog3.dfs.client.failover.proxy.provider.nameservice1" ->
       "org.apache.hadoop.hdfs.server.namenode.ha.ConfiguredFailoverProxyProvider",
     s"spark.sql.catalog.$catalog3.ipc.client.fallback-to-simple-auth-allowed" -> "true",
     "spark.hadoop.ipc.client.fallback-to-simple-auth-allowed" -> "true")

   val sparkConf = new SparkConf()
   sparkConf.setAll(oppoConfigs)
   val sparkSession = SparkSession.builder().master("local").config(sparkConf).getOrCreate()
   val dataFrame = sparkSession.sql(s"select * from $catalog3.data_transform.t_avro_2022")
   dataFrame.show()
   sparkSession.stop()
   ```
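
   For background on how these settings reach the connector: Spark delivers every `spark.sql.catalog.$catalog3.*` entry to the catalog plugin at load time with the prefix stripped, which is how `HiveTableCatalog` receives the metastore and HDFS options above. A minimal sketch of that mechanism using only Spark's public DataSource V2 API (the `EchoCatalog` class is hypothetical, for illustration only):

   ```scala
   // Hypothetical illustration (not Kyuubi code): how Spark hands the
   // spark.sql.catalog.<name>.* settings to a DataSource V2 catalog plugin.
   import org.apache.spark.sql.connector.catalog.CatalogPlugin
   import org.apache.spark.sql.util.CaseInsensitiveStringMap

   class EchoCatalog extends CatalogPlugin {
     private var catalogName: String = _

     // Spark strips the "spark.sql.catalog.<name>." prefix before calling this,
     // so the map holds keys such as "hive.metastore.uris" and "dfs.nameservices".
     override def initialize(name: String, options: CaseInsensitiveStringMap): Unit = {
       catalogName = name
       options.entrySet().forEach(e => println(s"$name: ${e.getKey} = ${e.getValue}"))
     }

     override def name(): String = catalogName
   }
   // Registered the same way as HiveTableCatalog, e.g.
   // spark.sql.catalog.echo=<package>.EchoCatalog
   ```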
   
   The Hive table was created with:
   ```sql
   CREATE TABLE `data_transform`.`t_avro_2022`(
     `id` bigint COMMENT '',
     `name` string COMMENT '',
     `age` int COMMENT '',
     `remark` string COMMENT '')
   COMMENT ''
   PARTITIONED BY (`dt` string COMMENT '', `pt` string COMMENT '')
   ROW FORMAT SERDE 'org.apache.hadoop.hive.serde2.avro.AvroSerDe'
   WITH SERDEPROPERTIES ('serialization.format' = '1')
   STORED AS INPUTFORMAT 'org.apache.hadoop.hive.ql.io.avro.AvroContainerInputFormat'
   OUTPUTFORMAT 'org.apache.hadoop.hive.ql.io.avro.AvroContainerOutputFormat';
   ```
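
   As a cross-check (not part of the failing run above), the same table can be queried through Spark's built-in Hive session catalog; a minimal sketch, assuming a Spark build with Hive support on the classpath:

   ```scala
   // Hypothetical comparison: read the same table via spark_catalog instead of
   // the Kyuubi connector. If this succeeds while the connector query fails,
   // the AvroSerDe problem is specific to the connector's scan path.
   import org.apache.spark.sql.SparkSession

   val spark = SparkSession.builder()
     .master("local")
     .config("hive.metastore.uris", "thrift://*******:9083") // same redacted HMS address
     .enableHiveSupport()
     .getOrCreate()

   spark.sql("select * from spark_catalog.data_transform.t_avro_2022").show()
   spark.stop()
   ```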
   
   The error message is:
   ```
   23/09/20 18:55:32 INFO DAGScheduler: ResultStage 0 (show at KyuubiConnectorSuite.scala:84) failed in 0.080 s due to Job aborted due to stage failure: Task not serializable: java.io.NotSerializableException: org.apache.hadoop.hive.serde2.avro.AvroSerDe
   Serialization stack:
        - object not serializable (class: org.apache.hadoop.hive.serde2.avro.AvroSerDe, value: org.apache.hadoop.hive.serde2.avro.AvroSerDe@8548569)
        - writeObject data (class: scala.collection.immutable.List$SerializationProxy)
        - object (class scala.collection.immutable.List$SerializationProxy, scala.collection.immutable.List$SerializationProxy@34759ac7)
        - writeReplace data (class: scala.collection.immutable.List$SerializationProxy)
        - object (class scala.collection.immutable.$colon$colon, List(org.apache.spark.OneToOneDependency@2c83e234))
        - field (class: org.apache.spark.rdd.RDD, name: dependencies_, type: interface scala.collection.Seq)
        - object (class org.apache.spark.rdd.MapPartitionsRDD, MapPartitionsRDD[1] at show at KyuubiConnectorSuite.scala:84)
        - field (class: org.apache.spark.NarrowDependency, name: _rdd, type: class org.apache.spark.rdd.RDD)
        - object (class org.apache.spark.OneToOneDependency, org.apache.spark.OneToOneDependency@66cb61c2)
        - writeObject data (class: scala.collection.immutable.List$SerializationProxy)
        - object (class scala.collection.immutable.List$SerializationProxy, scala.collection.immutable.List$SerializationProxy@581820a7)
        - writeReplace data (class: scala.collection.immutable.List$SerializationProxy)
        - object (class scala.collection.immutable.$colon$colon, List(org.apache.spark.OneToOneDependency@66cb61c2))
        - field (class: org.apache.spark.rdd.RDD, name: dependencies_, type: interface scala.collection.Seq)
        - object (class org.apache.spark.rdd.MapPartitionsRDD, MapPartitionsRDD[2] at show at KyuubiConnectorSuite.scala:84)
        - field (class: org.apache.spark.NarrowDependency, name: _rdd, type: class org.apache.spark.rdd.RDD)
        - object (class org.apache.spark.OneToOneDependency, org.apache.spark.OneToOneDependency@2f1a8bca)
        - writeObject data (class: scala.collection.immutable.List$SerializationProxy)
        - object (class scala.collection.immutable.List$SerializationProxy, scala.collection.immutable.List$SerializationProxy@2d4604fa)
        - writeReplace data (class: scala.collection.immutable.List$SerializationProxy)
        - object (class scala.collection.immutable.$colon$colon, List(org.apache.spark.OneToOneDependency@2f1a8bca))
        - field (class: org.apache.spark.rdd.RDD, name: dependencies_, type: interface scala.collection.Seq)
        - object (class org.apache.spark.rdd.MapPartitionsRDD, MapPartitionsRDD[3] at show at KyuubiConnectorSuite.scala:84)
        - field (class: scala.Tuple2, name: _1, type: class java.lang.Object)
        - object (class scala.Tuple2, (MapPartitionsRDD[3] at show at KyuubiConnectorSuite.scala:84,org.apache.spark.SparkContext$$Lambda$2362/121483686@6fb2b972))
   23/09/20 18:55:32 INFO DAGScheduler: Job 0 failed: show at KyuubiConnectorSuite.scala:84, took 0.107010 s
   
   
   Job aborted due to stage failure: Task not serializable: java.io.NotSerializableException: org.apache.hadoop.hive.serde2.avro.AvroSerDe
   Serialization stack:
        (same serialization stack as above, elided)
   org.apache.spark.SparkException: Job aborted due to stage failure: Task not serializable: java.io.NotSerializableException: org.apache.hadoop.hive.serde2.avro.AvroSerDe
   Serialization stack:
        (same serialization stack as above, elided)
        at org.apache.spark.scheduler.DAGScheduler.failJobAndIndependentStages(DAGScheduler.scala:2669)
        at org.apache.spark.scheduler.DAGScheduler.$anonfun$abortStage$2(DAGScheduler.scala:2605)
        at org.apache.spark.scheduler.DAGScheduler.$anonfun$abortStage$2$adapted(DAGScheduler.scala:2604)
        at scala.collection.mutable.ResizableArray.foreach(ResizableArray.scala:62)
        at scala.collection.mutable.ResizableArray.foreach$(ResizableArray.scala:55)
        at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:49)
        at org.apache.spark.scheduler.DAGScheduler.abortStage(DAGScheduler.scala:2604)
        at org.apache.spark.scheduler.DAGScheduler.submitMissingTasks(DAGScheduler.scala:1513)
        at org.apache.spark.scheduler.DAGScheduler.submitStage(DAGScheduler.scala:1325)
        at org.apache.spark.scheduler.DAGScheduler.handleJobSubmitted(DAGScheduler.scala:1267)
        at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.doOnReceive(DAGScheduler.scala:2815)
        at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:2807)
        at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:2796)
        at org.apache.spark.util.EventLoop$$anon$1.run(EventLoop.scala:49)
        at org.apache.spark.scheduler.DAGScheduler.runJob(DAGScheduler.scala:952)
        at org.apache.spark.SparkContext.runJob(SparkContext.scala:2258)
        at org.apache.spark.SparkContext.runJob(SparkContext.scala:2279)
        at org.apache.spark.SparkContext.runJob(SparkContext.scala:2298)
        at org.apache.spark.sql.execution.SparkPlan.executeTake(SparkPlan.scala:506)
        at org.apache.spark.sql.execution.SparkPlan.executeTake(SparkPlan.scala:459)
        at org.apache.spark.sql.execution.CollectLimitExec.executeCollect(limit.scala:48)
        at org.apache.spark.sql.Dataset.collectFromPlan(Dataset.scala:3874)
        at org.apache.spark.sql.Dataset.$anonfun$head$1(Dataset.scala:2869)
        at org.apache.spark.sql.Dataset.$anonfun$withAction$2(Dataset.scala:3864)
        at org.apache.spark.sql.execution.QueryExecution$.withInternalError(QueryExecution.scala:512)
        at org.apache.spark.sql.Dataset.$anonfun$withAction$1(Dataset.scala:3862)
        at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withNewExecutionId$6(SQLExecution.scala:109)
        at org.apache.spark.sql.execution.SQLExecution$.withSQLConfPropagated(SQLExecution.scala:169)
        at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withNewExecutionId$1(SQLExecution.scala:95)
        at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:779)
        at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:64)
        at org.apache.spark.sql.Dataset.withAction(Dataset.scala:3862)
        at org.apache.spark.sql.Dataset.head(Dataset.scala:2869)
        at org.apache.spark.sql.Dataset.take(Dataset.scala:3090)
        at org.apache.spark.sql.Dataset.getRows(Dataset.scala:294)
        at org.apache.spark.sql.Dataset.showString(Dataset.scala:333)
        at org.apache.spark.sql.Dataset.show(Dataset.scala:814)
        at org.apache.spark.sql.Dataset.show(Dataset.scala:773)
        at org.apache.spark.sql.Dataset.show(Dataset.scala:782)
        at com.netease.music.da.transfer.hive.KyuubiConnectorSuite.$anonfun$new$3(KyuubiConnectorSuite.scala:84)
        at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
        at org.scalatest.OutcomeOf.outcomeOf(OutcomeOf.scala:85)
        at org.scalatest.OutcomeOf.outcomeOf$(OutcomeOf.scala:83)
        at org.scalatest.OutcomeOf$.outcomeOf(OutcomeOf.scala:104)
        at org.scalatest.Transformer.apply(Transformer.scala:22)
        at org.scalatest.Transformer.apply(Transformer.scala:20)
        at org.scalatest.funsuite.AnyFunSuiteLike$$anon$1.apply(AnyFunSuiteLike.scala:226)
        at org.scalatest.TestSuite.withFixture(TestSuite.scala:196)
        at org.scalatest.TestSuite.withFixture$(TestSuite.scala:195)
        at org.scalatest.funsuite.AnyFunSuite.withFixture(AnyFunSuite.scala:1563)
        at org.scalatest.funsuite.AnyFunSuiteLike.invokeWithFixture$1(AnyFunSuiteLike.scala:224)
        at org.scalatest.funsuite.AnyFunSuiteLike.$anonfun$runTest$1(AnyFunSuiteLike.scala:236)
        at org.scalatest.SuperEngine.runTestImpl(Engine.scala:306)
        at org.scalatest.funsuite.AnyFunSuiteLike.runTest(AnyFunSuiteLike.scala:236)
        at org.scalatest.funsuite.AnyFunSuiteLike.runTest$(AnyFunSuiteLike.scala:218)
        at com.netease.music.da.transfer.hive.BaseSuite.org$scalatest$BeforeAndAfterEach$$super$runTest(BaseSuite.scala:11)
        at org.scalatest.BeforeAndAfterEach.runTest(BeforeAndAfterEach.scala:234)
        at org.scalatest.BeforeAndAfterEach.runTest$(BeforeAndAfterEach.scala:227)
        at com.netease.music.da.transfer.hive.BaseSuite.runTest(BaseSuite.scala:11)
        at org.scalatest.funsuite.AnyFunSuiteLike.$anonfun$runTests$1(AnyFunSuiteLike.scala:269)
        at org.scalatest.SuperEngine.$anonfun$runTestsInBranch$1(Engine.scala:413)
        at scala.collection.immutable.List.foreach(List.scala:431)
        at org.scalatest.SuperEngine.traverseSubNodes$1(Engine.scala:401)
        at org.scalatest.SuperEngine.runTestsInBranch(Engine.scala:396)
        at org.scalatest.SuperEngine.runTestsImpl(Engine.scala:475)
        at org.scalatest.funsuite.AnyFunSuiteLike.runTests(AnyFunSuiteLike.scala:269)
        at org.scalatest.funsuite.AnyFunSuiteLike.runTests$(AnyFunSuiteLike.scala:268)
        at org.scalatest.funsuite.AnyFunSuite.runTests(AnyFunSuite.scala:1563)
        at org.scalatest.Suite.run(Suite.scala:1112)
        at org.scalatest.Suite.run$(Suite.scala:1094)
        at org.scalatest.funsuite.AnyFunSuite.org$scalatest$funsuite$AnyFunSuiteLike$$super$run(AnyFunSuite.scala:1563)
        at org.scalatest.funsuite.AnyFunSuiteLike.$anonfun$run$1(AnyFunSuiteLike.scala:273)
        at org.scalatest.SuperEngine.runImpl(Engine.scala:535)
        at org.scalatest.funsuite.AnyFunSuiteLike.run(AnyFunSuiteLike.scala:273)
        at org.scalatest.funsuite.AnyFunSuiteLike.run$(AnyFunSuiteLike.scala:272)
        at com.netease.music.da.transfer.hive.BaseSuite.org$scalatest$BeforeAndAfterAll$$super$run(BaseSuite.scala:11)
        at org.scalatest.BeforeAndAfterAll.liftedTree1$1(BeforeAndAfterAll.scala:213)
        at org.scalatest.BeforeAndAfterAll.run(BeforeAndAfterAll.scala:210)
        at org.scalatest.BeforeAndAfterAll.run$(BeforeAndAfterAll.scala:208)
        at com.netease.music.da.transfer.hive.BaseSuite.run(BaseSuite.scala:11)
        at org.scalatest.tools.SuiteRunner.run(SuiteRunner.scala:45)
        at org.scalatest.tools.Runner$.$anonfun$doRunRunRunDaDoRunRun$13(Runner.scala:1322)
        at org.scalatest.tools.Runner$.$anonfun$doRunRunRunDaDoRunRun$13$adapted(Runner.scala:1316)
        at scala.collection.immutable.List.foreach(List.scala:431)
        at org.scalatest.tools.Runner$.doRunRunRunDaDoRunRun(Runner.scala:1316)
        at org.scalatest.tools.Runner$.$anonfun$runOptionallyWithPassFailReporter$24(Runner.scala:993)
        at org.scalatest.tools.Runner$.$anonfun$runOptionallyWithPassFailReporter$24$adapted(Runner.scala:971)
        at org.scalatest.tools.Runner$.withClassLoaderAndDispatchReporter(Runner.scala:1482)
        at org.scalatest.tools.Runner$.runOptionallyWithPassFailReporter(Runner.scala:971)
        at org.scalatest.tools.Runner$.run(Runner.scala:798)
        at org.scalatest.tools.Runner.run(Runner.scala)
        at org.jetbrains.plugins.scala.testingSupport.scalaTest.ScalaTestRunner.runScalaTest2or3(ScalaTestRunner.java:38)
        at org.jetbrains.plugins.scala.testingSupport.scalaTest.ScalaTestRunner.main(ScalaTestRunner.java:25)
   Caused by: java.io.NotSerializableException: org.apache.hadoop.hive.serde2.avro.AvroSerDe
   Serialization stack:
        (same serialization stack as above, elided)
        at org.apache.spark.serializer.SerializationDebugger$.improveException(SerializationDebugger.scala:41)
        at org.apache.spark.serializer.JavaSerializationStream.writeObject(JavaSerializer.scala:49)
        at org.apache.spark.serializer.JavaSerializerInstance.serialize(JavaSerializer.scala:115)
        at org.apache.spark.scheduler.DAGScheduler.submitMissingTasks(DAGScheduler.scala:1499)
        at org.apache.spark.scheduler.DAGScheduler.submitStage(DAGScheduler.scala:1325)
        at org.apache.spark.scheduler.DAGScheduler.handleJobSubmitted(DAGScheduler.scala:1267)
        at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.doOnReceive(DAGScheduler.scala:2815)
        at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:2807)
        at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:2796)
        at org.apache.spark.util.EventLoop$$anon$1.run(EventLoop.scala:49)
   ```
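
   The stack shows a Hive `AvroSerDe` instance sitting in the RDD's `dependencies_` chain, so it gets Java-serialized with the task, and `AvroSerDe` does not implement `java.io.Serializable`. A minimal sketch of this failure mode and the usual remedy, assuming nothing about the connector's internals (`NonSerializableDeserializer` below is hypothetical, standing in for `AvroSerDe`):

   ```scala
   import org.apache.spark.sql.SparkSession

   // Stand-in for a class that, like AvroSerDe, is not java.io.Serializable.
   class NonSerializableDeserializer {
     def deserialize(bytes: Array[Byte]): String = new String(bytes)
   }

   object ClosureCaptureDemo {
     def main(args: Array[String]): Unit = {
       val spark = SparkSession.builder().master("local").getOrCreate()
       val sc = spark.sparkContext

       // BROKEN: created on the driver and captured by the map closure, the
       // deserializer must be serialized with the task, which throws
       // java.io.NotSerializableException exactly as in the report:
       // val deser = new NonSerializableDeserializer
       // sc.parallelize(Seq("a".getBytes)).map(deser.deserialize).collect()

       // WORKS: build the non-serializable object on the executor instead,
       // once per partition (a @transient lazy val field achieves the same).
       val fixed = sc.parallelize(Seq("a".getBytes, "b".getBytes))
         .mapPartitions { iter =>
           val deser = new NonSerializableDeserializer // executor-side instance
           iter.map(deser.deserialize)
         }
       println(fixed.collect().mkString(","))
       spark.stop()
     }
   }
   ```

   If that reading is right, the fix presumably belongs in the connector (for example, holding the SerDe in a `@transient lazy val` so it is re-created on executors) rather than in user code.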
   
   ### Affects Version(s)
   
   1.8.0.1, 1.8.0.4
   
   ### Kyuubi Server Log Output
   
   ```logtalk
   No log; only the kyuubi-spark-connector-hive jar is used, without a Kyuubi server.
   ```
   
   
   ### Kyuubi Engine Log Output
   
   ```logtalk
   Not applicable; only the kyuubi-spark-connector-hive jar is used.
   ```
   
   
   ### Kyuubi Server Configurations
   
   ```yaml
   Not applicable; only the kyuubi-spark-connector-hive jar is used.
   ```
   
   
   ### Kyuubi Engine Configurations
   
   ```yaml
   Not applicable; only the kyuubi-spark-connector-hive jar is used.
   ```
   
   
   ### Additional context
   
   _No response_
   
   ### Are you willing to submit PR?
   
   - [ ] Yes. I would be willing to submit a PR with guidance from the Kyuubi community to fix.
   - [X] No. I cannot submit a PR at this time.

