loudongfeng commented on PR #11423:
URL: https://github.com/apache/incubator-gluten/pull/11423#issuecomment-3757865282

   There seem to be session-sharing issues when the Hive suites are enabled, and a local run cannot reproduce them. So, stop adding ORC-related Hive suites until gluten-ut supports them.
   ```
   09:43:16.531 ScalaTest-main-running-DiscoverySuite WARN SparkSession: An existing Spark session exists as the active or default session.
   This probably means another suite leaked it. Attempting to stop it before continuing.
   This existing Spark session was created at:
   
   org.apache.spark.sql.hive.orc.GlutenHiveOrcHadoopFsRelationSuite.<init>(GlutenOrcHadoopFsRelationSuite.scala:24)
   java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
   java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:77)
   java.base/jdk.internal.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
   java.base/java.lang.reflect.Constructor.newInstanceWithCaller(Constructor.java:499)
   java.base/java.lang.reflect.ReflectAccess.newInstance(ReflectAccess.java:128)
   java.base/jdk.internal.reflect.ReflectionFactory.newInstance(ReflectionFactory.java:347)
   java.base/java.lang.Class.newInstance(Class.java:645)
   org.scalatest.tools.DiscoverySuite$.getSuiteInstance(DiscoverySuite.scala:66)
   org.scalatest.tools.DiscoverySuite.$anonfun$nestedSuites$1(DiscoverySuite.scala:38)
   scala.collection.TraversableLike.$anonfun$map$1(TraversableLike.scala:286)
   scala.collection.Iterator.foreach(Iterator.scala:943)
   scala.collection.Iterator.foreach$(Iterator.scala:943)
   scala.collection.AbstractIterator.foreach(Iterator.scala:1431)
   scala.collection.IterableLike.foreach(IterableLike.scala:74)
   scala.collection.IterableLike.foreach$(IterableLike.scala:73)
   scala.collection.AbstractIterable.foreach(Iterable.scala:56)
   scala.collection.TraversableLike.map(TraversableLike.scala:286)
   scala.collection.TraversableLike.map$(TraversableLike.scala:279)
   scala.collection.AbstractTraversable.map(Traversable.scala:108)
   ```
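
   For reference, this kind of leak usually comes from a suite that builds its SparkSession eagerly (here in the GlutenHiveOrcHadoopFsRelationSuite constructor, per the trace) without a matching cleanup. A minimal sketch of the isolation pattern, assuming a plain ScalaTest suite; the suite name, app name, and test are illustrative, not the actual gluten-ut code:
   ```scala
   import org.apache.spark.sql.SparkSession
   import org.scalatest.BeforeAndAfterAll
   import org.scalatest.funsuite.AnyFunSuite

   // Illustrative suite: defer session creation to beforeAll and always
   // tear it down in afterAll, so the next discovered suite does not
   // inherit a leaked active/default session.
   class IsolatedSessionSuite extends AnyFunSuite with BeforeAndAfterAll {
     private var spark: SparkSession = _

     override def beforeAll(): Unit = {
       super.beforeAll()
       spark = SparkSession.builder()
         .master("local[2]")
         .appName("isolated-session-suite") // hypothetical app name
         .getOrCreate()
     }

     override def afterAll(): Unit = {
       try {
         if (spark != null) spark.stop()
         // Drop the thread-local and global defaults so a later suite
         // starts from a clean slate.
         SparkSession.clearActiveSession()
         SparkSession.clearDefaultSession()
       } finally {
         super.afterAll()
       }
     }

     test("session is usable inside this suite") {
       assert(spark.range(10).count() == 10)
     }
   }
   ```
   Note that ScalaTest's DiscoverySuite instantiates every discovered suite reflectively (visible in the trace above), so constructor work runs up front for all suites, which is one more reason to defer session creation to beforeAll.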
   
   ```
   java.util.NoSuchElementException: None.get
         at scala.None$.get(Option.scala:529)
         at scala.None$.get(Option.scala:527)
         at org.apache.spark.sql.execution.datasources.BasicWriteJobStatsTracker$.metrics(BasicWriteStatsTracker.scala:239)
         at org.apache.spark.sql.execution.command.DataWritingCommand.metrics(DataWritingCommand.scala:55)
         at org.apache.spark.sql.execution.command.DataWritingCommand.metrics$(DataWritingCommand.scala:55)
         at org.apache.spark.sql.execution.datasources.InsertIntoHadoopFsRelationCommand.metrics$lzycompute(InsertIntoHadoopFsRelationCommand.scala:47)
         at org.apache.spark.sql.execution.datasources.InsertIntoHadoopFsRelationCommand.metrics(InsertIntoHadoopFsRelationCommand.scala:47)
         at org.apache.spark.sql.execution.command.DataWritingCommandExec.metrics$lzycompute(commands.scala:109)
         at org.apache.spark.sql.execution.command.DataWritingCommandExec.metrics(commands.scala:109)
         at org.apache.spark.sql.execution.SparkPlanInfo$.fromSparkPlan(SparkPlanInfo.scala:63)
         at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withNewExecutionId$6(SQLExecution.scala:120)
         at org.apache.spark.sql.execution.SQLExecution$.withSQLConfPropagated(SQLExecution.scala:201)
         at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withNewExecutionId$1(SQLExecution.scala:108)
         at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:900)
         at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:66)
         at org.apache.spark.sql.execution.QueryExecution$$anonfun$eagerlyExecuteCommands$1.applyOrElse(QueryExecution.scala:107)
         at org.apache.spark.sql.execution.QueryExecution$$anonfun$eagerlyExecuteCommands$1.applyOrElse(QueryExecution.scala:98)
         at org.apache.spark.sql.catalyst.trees.TreeNode.$anonfun$transformDownWithPruning$1(TreeNode.scala:461)
         at org.apache.spark.sql.catalyst.trees.CurrentOrigin$.withOrigin(origin.scala:76)
         at org.apache.spark.sql.catalyst.trees.TreeNode.transformDownWithPruning(TreeNode.scala:461)
         at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.org$apache$spark$sql$catalyst$plans$logical$AnalysisHelper$$super$transformDownWithPruning(LogicalPlan.scala:32)
         at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.transformDownWithPruning(AnalysisHelper.scala:267)
         at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.transformDownWithPruning$(AnalysisHelper.scala:263)
         at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.transformDownWithPruning(LogicalPlan.scala:32)
         at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.transformDownWithPruning(LogicalPlan.scala:32)
         at org.apache.spark.sql.catalyst.trees.TreeNode.transformDown(TreeNode.scala:437)
         at org.apache.spark.sql.execution.QueryExecution.eagerlyExecuteCommands(QueryExecution.scala:98)
         at org.apache.spark.sql.execution.QueryExecution.commandExecuted$lzycompute(QueryExecution.scala:85)
         at org.apache.spark.sql.execution.QueryExecution.commandExecuted(QueryExecution.scala:83)
         at org.apache.spark.sql.execution.QueryExecution.assertCommandExecuted(QueryExecution.scala:142)
         at org.apache.spark.sql.DataFrameWriter.runCommand(DataFrameWriter.scala:869)
         at org.apache.spark.sql.DataFrameWriter.saveToV1Source(DataFrameWriter.scala:391)
         at org.apache.spark.sql.DataFrameWriter.saveInternal(DataFrameWriter.scala:364)
         at org.apache.spark.sql.DataFrameWriter.save(DataFrameWriter.scala:243)
         at org.apache.spark.sql.execution.datasources.FileBasedDataSourceTest.$anonfun$withDataSourceFile$1(FileBasedDataSourceTest.scala:74)
         at org.apache.spark.sql.execution.datasources.FileBasedDataSourceTest.$anonfun$withDataSourceFile$1$adapted(FileBasedDataSourceTest.scala:73)
         at org.apache.spark.sql.catalyst.plans.SQLHelper.withTempPath(SQLHelper.scala:69)
         at org.apache.spark.sql.catalyst.plans.SQLHelper.withTempPath$(SQLHelper.scala:66)
         at org.apache.spark.sql.QueryTest.withTempPath(QueryTest.scala:34)
         at org.apache.spark.sql.execution.datasources.FileBasedDataSourceTest.withDataSourceFile(FileBasedDataSourceTest.scala:73)
         at org.apache.spark.sql.execution.datasources.FileBasedDataSourceTest.withDataSourceFile$(FileBasedDataSourceTest.scala:70)
         at org.apache.spark.sql.execution.datasources.orc.OrcQueryTest.withDataSourceFile(OrcQuerySuite.scala:68)
         at org.apache.spark.sql.execution.datasources.orc.OrcTest.withOrcFile(OrcTest.scala:79)
         at org.apache.spark.sql.execution.datasources.orc.OrcTest.withOrcFile$(OrcTest.scala:77)
         at org.apache.spark.sql.execution.datasources.orc.OrcQueryTest.withOrcFile(OrcQuerySuite.scala:68)
         at org.apache.spark.sql.execution.datasources.orc.OrcQueryTest.$anonfun$new$1(OrcQuerySuite.scala:76)
         at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
         at org.scalatest.enablers.Timed$$anon$1.timeoutAfter(Timed.scala:127)
         at org.scalatest.concurrent.TimeLimits$.failAfterImpl(TimeLimits.scala:282)
         at org.scalatest.concurrent.TimeLimits.failAfter(TimeLimits.scala:231)
         at org.scalatest.concurrent.TimeLimits.failAfter$(TimeLimits.scala:230)
         at org.apache.spark.SparkFunSuite.failAfter(SparkFunSuite.scala:69)
         at org.apache.spark.SparkFunSuite.$anonfun$test$2(SparkFunSuite.scala:155)
         at org.scalatest.OutcomeOf.outcomeOf(OutcomeOf.scala:85)
         at org.scalatest.OutcomeOf.outcomeOf$(OutcomeOf.scala:83)
         at org.scalatest.OutcomeOf$.outcomeOf(OutcomeOf.scala:104)
         at org.scalatest.Transformer.apply(Transformer.scala:22)
         at org.scalatest.Transformer.apply(Transformer.scala:20)
         at org.scalatest.funsuite.AnyFunSuiteLike$$anon$1.apply(AnyFunSuiteLike.scala:226)
         at org.apache.spark.SparkFunSuite.withFixture(SparkFunSuite.scala:227)
         at org.scalatest.funsuite.AnyFunSuiteLike.invokeWithFixture$1(AnyFunSuiteLike.scala:224)
         at org.scalatest.funsuite.AnyFunSuiteLike.$anonfun$runTest$1(AnyFunSuiteLike.scala:236)
         at org.scalatest.SuperEngine.runTestImpl(Engine.scala:306)
         at org.scalatest.funsuite.AnyFunSuiteLike.runTest(AnyFunSuiteLike.scala:236)
         at org.scalatest.funsuite.AnyFunSuiteLike.runTest$(AnyFunSuiteLike.scala:218)
         at org.apache.spark.SparkFunSuite.org$scalatest$BeforeAndAfterEach$$super$runTest(SparkFunSuite.scala:69)
         at org.scalatest.BeforeAndAfterEach.runTest(BeforeAndAfterEach.scala:234)
         at org.scalatest.BeforeAndAfterEach.runTest$(BeforeAndAfterEach.scala:227)
         at org.apache.spark.SparkFunSuite.runTest(SparkFunSuite.scala:69)
         at org.scalatest.funsuite.AnyFunSuiteLike.$anonfun$runTests$1(AnyFunSuiteLike.scala:269)
         at org.scalatest.SuperEngine.$anonfun$runTestsInBranch$1(Engine.scala:413)
         at scala.collection.immutable.List.foreach(List.scala:431)
         at org.scalatest.SuperEngine.traverseSubNodes$1(Engine.scala:401)
         at org.scalatest.SuperEngine.runTestsInBranch(Engine.scala:396)
         at org.scalatest.SuperEngine.runTestsImpl(Engine.scala:475)
         at org.scalatest.funsuite.AnyFunSuiteLike.runTests(AnyFunSuiteLike.scala:269)
         at org.scalatest.funsuite.AnyFunSuiteLike.runTests$(AnyFunSuiteLike.scala:268)
         at org.scalatest.funsuite.AnyFunSuite.runTests(AnyFunSuite.scala:1564)
         at org.scalatest.Suite.run(Suite.scala:1114)
         at org.scalatest.Suite.run$(Suite.scala:1096)
         at org.scalatest.funsuite.AnyFunSuite.org$scalatest$funsuite$AnyFunSuiteLike$$super$run(AnyFunSuite.scala:1564)
         at org.scalatest.funsuite.AnyFunSuiteLike.$anonfun$run$1(AnyFunSuiteLike.scala:273)
         at org.scalatest.SuperEngine.runImpl(Engine.scala:535)
         at org.scalatest.funsuite.AnyFunSuiteLike.run(AnyFunSuiteLike.scala:273)
         at org.scalatest.funsuite.AnyFunSuiteLike.run$(AnyFunSuiteLike.scala:272)
         at org.apache.spark.SparkFunSuite.org$scalatest$BeforeAndAfterAll$$super$run(SparkFunSuite.scala:69)
         at org.scalatest.BeforeAndAfterAll.liftedTree1$1(BeforeAndAfterAll.scala:213)
         at org.scalatest.BeforeAndAfterAll.run(BeforeAndAfterAll.scala:210)
         at org.scalatest.BeforeAndAfterAll.run$(BeforeAndAfterAll.scala:208)
         at org.apache.spark.SparkFunSuite.run(SparkFunSuite.scala:69)
         at org.scalatest.Suite.callExecuteOnSuite$1(Suite.scala:1178)
         at org.scalatest.Suite.$anonfun$runNestedSuites$1(Suite.scala:1225)
         at scala.collection.IndexedSeqOptimized.foreach(IndexedSeqOptimized.scala:36)
         at scala.collection.IndexedSeqOptimized.foreach$(IndexedSeqOptimized.scala:33)
         at scala.collection.mutable.ArrayOps$ofRef.foreach(ArrayOps.scala:198)
         at org.scalatest.Suite.runNestedSuites(Suite.scala:1223)
         at org.scalatest.Suite.runNestedSuites$(Suite.scala:1156)
         at org.scalatest.tools.DiscoverySuite.runNestedSuites(DiscoverySuite.scala:30)
         at org.scalatest.Suite.run(Suite.scala:1111)
         at org.scalatest.Suite.run$(Suite.scala:1096)
         at org.scalatest.tools.DiscoverySuite.run(DiscoverySuite.scala:30)
         at org.scalatest.tools.SuiteRunner.run(SuiteRunner.scala:47)
         at org.scalatest.tools.Runner$.$anonfun$doRunRunRunDaDoRunRun$13(Runner.scala:1321)
         at org.scalatest.tools.Runner$.$anonfun$doRunRunRunDaDoRunRun$13$adapted(Runner.scala:1315)
         at scala.collection.immutable.List.foreach(List.scala:431)
         at org.scalatest.tools.Runner$.doRunRunRunDaDoRunRun(Runner.scala:1315)
         at org.scalatest.tools.Runner$.$anonfun$runOptionallyWithPassFailReporter$24(Runner.scala:992)
         at org.scalatest.tools.Runner$.$anonfun$runOptionallyWithPassFailReporter$24$adapted(Runner.scala:970)
         at org.scalatest.tools.Runner$.withClassLoaderAndDispatchReporter(Runner.scala:1481)
         at org.scalatest.tools.Runner$.runOptionallyWithPassFailReporter(Runner.scala:970)
         at org.scalatest.tools.Runner$.main(Runner.scala:775)
         at org.scalatest.tools.Runner.main(Runner.scala)
   ```
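
   The None.get is consistent with that theory: the failing frame, BasicWriteJobStatsTracker$.metrics (BasicWriteStatsTracker.scala:239), appears to resolve its SQL metrics through the currently registered SparkContext, so the lookup comes back empty once a prior suite's context has been stopped and deregistered. A small sketch of the same exception shape via the public session lookup (standalone and illustrative, not gluten-ut code):
   ```scala
   import org.apache.spark.sql.SparkSession

   object NoneGetShape {
     def main(args: Array[String]): Unit = {
       // No session has been created in this JVM, so the Option is empty
       // and .get throws the same
       // java.util.NoSuchElementException: None.get seen above.
       val active: Option[SparkSession] = SparkSession.getActiveSession
       println(active) // None
       active.get      // throws NoSuchElementException: None.get
     }
   }
   ```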
   
   ```
   java.lang.IllegalStateException: Cannot call methods on a stopped SparkContext.
   This stopped SparkContext was created at:
   
   org.apache.spark.sql.hive.orc.GlutenHiveOrcHadoopFsRelationSuite.<init>(GlutenOrcHadoopFsRelationSuite.scala:24)
   java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
   java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:77)
   java.base/jdk.internal.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
   java.base/java.lang.reflect.Constructor.newInstanceWithCaller(Constructor.java:499)
   java.base/java.lang.reflect.ReflectAccess.newInstance(ReflectAccess.java:128)
   java.base/jdk.internal.reflect.ReflectionFactory.newInstance(ReflectionFactory.java:347)
   java.base/java.lang.Class.newInstance(Class.java:645)
   org.scalatest.tools.DiscoverySuite$.getSuiteInstance(DiscoverySuite.scala:66)
   org.scalatest.tools.DiscoverySuite.$anonfun$nestedSuites$1(DiscoverySuite.scala:38)
   scala.collection.TraversableLike.$anonfun$map$1(TraversableLike.scala:286)
   scala.collection.Iterator.foreach(Iterator.scala:943)
   scala.collection.Iterator.foreach$(Iterator.scala:943)
   scala.collection.AbstractIterator.foreach(Iterator.scala:1431)
   scala.collection.IterableLike.foreach(IterableLike.scala:74)
   scala.collection.IterableLike.foreach$(IterableLike.scala:73)
   scala.collection.AbstractIterable.foreach(Iterable.scala:56)
   scala.collection.TraversableLike.map(TraversableLike.scala:286)
   scala.collection.TraversableLike.map$(TraversableLike.scala:279)
   scala.collection.AbstractTraversable.map(Traversable.scala:108)
   
   The currently active SparkContext was created at:
   
   (No active SparkContext.)
   ```
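
   The stopped-context error is the other face of the same problem: once the shared context has been stopped (for example by the leak recovery in the first warning), any later call through it fails fast, and the error message embeds the creation site, which again points at the suite constructor. A minimal reproduction of the error shape, with an illustrative app name:
   ```scala
   import org.apache.spark.{SparkConf, SparkContext}

   object StoppedContextShape {
     def main(args: Array[String]): Unit = {
       val sc = new SparkContext(
         new SparkConf().setMaster("local[1]").setAppName("stopped-context-shape"))
       sc.stop()
       // Any method call after stop() throws
       // java.lang.IllegalStateException: Cannot call methods on a stopped
       // SparkContext, with the creation site (this main method) in the message.
       sc.parallelize(1 to 3).count()
     }
   }
   ```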

