Ngone51 commented on issue #27492: [SPARK-30755][SQL][test-hive1.2] Support Hive 1.2.1's Serde after making built-in Hive to 2.3
URL: https://github.com/apache/spark/pull/27492#issuecomment-583933078

Hi @wangyum, I got this error when running `SPARK-30755: Support Hive 1.2.1's Serde after making built-in Hive to 2.3` in `HiveQuerySuite` locally with `./build/sbt -Phadoop-2.7 -Phive-2.3 -Phive-thriftserver -Phive`:

```
[info] - SPARK-30755: Support Hive 1.2.1's Serde after making built-in Hive to 2.3 *** FAILED *** (5 seconds, 875 milliseconds)
[info] org.apache.spark.sql.AnalysisException: org.apache.hadoop.hive.ql.metadata.HiveException: java.lang.ClassCastException: class org.apache.spark.sql.hive.DummyHiveSerde;
[info] at org.apache.spark.sql.hive.HiveExternalCatalog.withClient(HiveExternalCatalog.scala:109)
[info] at org.apache.spark.sql.hive.HiveExternalCatalog.createTable(HiveExternalCatalog.scala:242)
[info] at org.apache.spark.sql.catalyst.catalog.ExternalCatalogWithListener.createTable(ExternalCatalogWithListener.scala:94)
[info] at org.apache.spark.sql.catalyst.catalog.SessionCatalog.createTable(SessionCatalog.scala:325)
[info] at org.apache.spark.sql.execution.command.CreateTableCommand.run(tables.scala:163)
[info] at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult$lzycompute(commands.scala:70)
[info] at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult(commands.scala:68)
[info] at org.apache.spark.sql.execution.command.ExecutedCommandExec.executeCollect(commands.scala:79)
[info] at org.apache.spark.sql.Dataset.$anonfun$logicalPlan$1(Dataset.scala:226)
[info] at org.apache.spark.sql.Dataset.$anonfun$withAction$1(Dataset.scala:3497)
[info] at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withNewExecutionId$4(SQLExecution.scala:102)
[info] at org.apache.spark.sql.execution.SQLExecution$.withSQLConfPropagated(SQLExecution.scala:162)
[info] at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:89)
[info] at org.apache.spark.sql.Dataset.withAction(Dataset.scala:3495)
[info] at org.apache.spark.sql.Dataset.<init>(Dataset.scala:226)
[info] at org.apache.spark.sql.Dataset$.ofRows(Dataset.scala:88)
[info] at org.apache.spark.sql.hive.test.TestHiveSparkSession.sql(TestHive.scala:238)
[info] at org.apache.spark.sql.hive.execution.HiveQuerySuite.$anonfun$new$114(HiveQuerySuite.scala:1231)
[info] at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
[info] at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1377)
[info] at org.apache.spark.sql.test.SQLTestUtilsBase.withTable(SQLTestUtils.scala:290)
[info] at org.apache.spark.sql.test.SQLTestUtilsBase.withTable$(SQLTestUtils.scala:288)
[info] at org.apache.spark.sql.hive.execution.HiveQuerySuite.withTable(HiveQuerySuite.scala:49)
[info] at org.apache.spark.sql.hive.execution.HiveQuerySuite.$anonfun$new$113(HiveQuerySuite.scala:1229)
[info] at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
[info] at org.scalatest.OutcomeOf.outcomeOf(OutcomeOf.scala:85)
[info] at org.scalatest.OutcomeOf.outcomeOf$(OutcomeOf.scala:83)
[info] at org.scalatest.OutcomeOf$.outcomeOf(OutcomeOf.scala:104)
[info] at org.scalatest.Transformer.apply(Transformer.scala:22)
[info] at org.scalatest.Transformer.apply(Transformer.scala:20)
[info] at org.scalatest.FunSuiteLike$$anon$1.apply(FunSuiteLike.scala:186)
[info] at org.apache.spark.SparkFunSuite.withFixture(SparkFunSuite.scala:151)
[info] at org.scalatest.FunSuiteLike.invokeWithFixture$1(FunSuiteLike.scala:184)
[info] at org.scalatest.FunSuiteLike.$anonfun$runTest$1(FunSuiteLike.scala:196)
[info] at org.scalatest.SuperEngine.runTestImpl(Engine.scala:286)
[info] at org.scalatest.FunSuiteLike.runTest(FunSuiteLike.scala:196)
[info] at org.scalatest.FunSuiteLike.runTest$(FunSuiteLike.scala:178)
[info] at org.apache.spark.SparkFunSuite.org$scalatest$BeforeAndAfterEach$$super$runTest(SparkFunSuite.scala:58)
[info] at org.scalatest.BeforeAndAfterEach.runTest(BeforeAndAfterEach.scala:221)
[info] at org.scalatest.BeforeAndAfterEach.runTest$(BeforeAndAfterEach.scala:214)
[info] at org.apache.spark.sql.hive.execution.HiveQuerySuite.org$scalatest$BeforeAndAfter$$super$runTest(HiveQuerySuite.scala:49)
[info] at org.scalatest.BeforeAndAfter.runTest(BeforeAndAfter.scala:203)
[info] at org.scalatest.BeforeAndAfter.runTest$(BeforeAndAfter.scala:192)
[info] at org.apache.spark.sql.hive.execution.HiveQuerySuite.runTest(HiveQuerySuite.scala:49)
[info] at org.scalatest.FunSuiteLike.$anonfun$runTests$1(FunSuiteLike.scala:229)
[info] at org.scalatest.SuperEngine.$anonfun$runTestsInBranch$1(Engine.scala:393)
[info] at scala.collection.immutable.List.foreach(List.scala:392)
[info] at org.scalatest.SuperEngine.traverseSubNodes$1(Engine.scala:381)
[info] at org.scalatest.SuperEngine.runTestsInBranch(Engine.scala:376)
[info] at org.scalatest.SuperEngine.runTestsImpl(Engine.scala:458)
[info] at org.scalatest.FunSuiteLike.runTests(FunSuiteLike.scala:229)
[info] at org.scalatest.FunSuiteLike.runTests$(FunSuiteLike.scala:228)
[info] at org.scalatest.FunSuite.runTests(FunSuite.scala:1560)
[info] at org.scalatest.Suite.run(Suite.scala:1124)
[info] at org.scalatest.Suite.run$(Suite.scala:1106)
[info] at org.scalatest.FunSuite.org$scalatest$FunSuiteLike$$super$run(FunSuite.scala:1560)
[info] at org.scalatest.FunSuiteLike.$anonfun$run$1(FunSuiteLike.scala:233)
[info] at org.scalatest.SuperEngine.runImpl(Engine.scala:518)
[info] at org.scalatest.FunSuiteLike.run(FunSuiteLike.scala:233)
[info] at org.scalatest.FunSuiteLike.run$(FunSuiteLike.scala:232)
[info] at org.apache.spark.SparkFunSuite.org$scalatest$BeforeAndAfterAll$$super$run(SparkFunSuite.scala:58)
[info] at org.scalatest.BeforeAndAfterAll.liftedTree1$1(BeforeAndAfterAll.scala:213)
[info] at org.scalatest.BeforeAndAfterAll.run(BeforeAndAfterAll.scala:210)
[info] at org.scalatest.BeforeAndAfterAll.run$(BeforeAndAfterAll.scala:208)
[info] at org.apache.spark.sql.hive.execution.HiveQuerySuite.org$scalatest$BeforeAndAfter$$super$run(HiveQuerySuite.scala:49)
[info] at org.scalatest.BeforeAndAfter.run(BeforeAndAfter.scala:258)
[info] at org.scalatest.BeforeAndAfter.run$(BeforeAndAfter.scala:256)
[info] at org.apache.spark.sql.hive.execution.HiveQuerySuite.run(HiveQuerySuite.scala:49)
[info] at org.scalatest.tools.Framework.org$scalatest$tools$Framework$$runSuite(Framework.scala:317)
[info] at org.scalatest.tools.Framework$ScalaTestTask.execute(Framework.scala:510)
[info] at sbt.ForkMain$Run$2.call(ForkMain.java:296)
[info] at sbt.ForkMain$Run$2.call(ForkMain.java:286)
[info] at java.util.concurrent.FutureTask.run(FutureTask.java:266)
[info] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
[info] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
[info] at java.lang.Thread.run(Thread.java:748)
...
```
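I haven't dug into the new test body itself, but judging from the stack trace the exception is thrown while the `CREATE TABLE` issued inside `withTable` reaches `HiveExternalCatalog.createTable`, and the offending class is the custom `org.apache.spark.sql.hive.DummyHiveSerde`. Roughly, something along the lines of this sketch (hypothetical; the table name, column, and exact SQL are my guesses, not the actual test code):

```scala
// Hypothetical reproduction sketch inferred from the stack trace: a CREATE TABLE
// that declares the custom DummyHiveSerde as its serde, run inside withTable,
// fails in HiveExternalCatalog.createTable with the ClassCastException above.
withTable("t") {
  sql(
    """CREATE TABLE t (a INT)
      |ROW FORMAT SERDE 'org.apache.spark.sql.hive.DummyHiveSerde'
      |""".stripMargin)
}
```

Is this expected to pass with the `-Phive-2.3` profile, or am I missing some local setup?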
