LuciferYang commented on PR #42757:
URL: https://github.com/apache/spark/pull/42757#issuecomment-1707754014

   @sunchao After this PR was merged, the daily Scala 2.13 tests started failing. 
The issue can be reproduced offline by executing the following commands:
   
   ```
   dev/change-scala-version.sh 2.13
   build/sbt "sql/testOnly 
org.apache.spark.sql.connector.KeyGroupedPartitioningSuite" -Pscala-2.13
   ```
   
   ```
   [info] - clustered distribution: output partitioning should be 
KeyGroupedPartitioning *** FAILED *** (1 second, 895 milliseconds)
   [info]   
KeyGroupedPartitioning(List(transformexpression(org.apache.spark.sql.connector.catalog.functions.YearsFunction$@92fa663,
 ts#11, None)), 3, List([50], [51], [52]), List([50], [51], [52])) did not 
equal 
KeyGroupedPartitioning(ArraySeq(transformexpression(org.apache.spark.sql.connector.catalog.functions.YearsFunction$@92fa663,
 ts#11, None)), 3, List([51], [50], [52]), ArraySeq([50], [51], [52])) 
(KeyGroupedPartitioningSuite.scala:237)
   [info]   Analysis:
   [info]   KeyGroupedPartitioning(partitionValues: List(0: [50] -> [51], 1: 
[51] -> [50]), uniquePartitionValues: List(0: [50] -> [51], 1: [51] -> [50]))
   [info]   org.scalatest.exceptions.TestFailedException:
   [info]   at 
org.scalatest.Assertions.newAssertionFailedException(Assertions.scala:472)
   [info]   at 
org.scalatest.Assertions.newAssertionFailedException$(Assertions.scala:471)
   [info]   at 
org.scalatest.Assertions$.newAssertionFailedException(Assertions.scala:1231)
   [info]   at 
org.scalatest.Assertions$AssertionsHelper.macroAssert(Assertions.scala:1295)
   [info]   at 
org.apache.spark.sql.connector.KeyGroupedPartitioningSuite.checkQueryPlan(KeyGroupedPartitioningSuite.scala:237)
   [info]   at 
org.apache.spark.sql.connector.KeyGroupedPartitioningSuite.$anonfun$new$4(KeyGroupedPartitioningSuite.scala:103)
   [info]   at 
scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.scala:18)
   [info]   at 
org.scalatest.enablers.Timed$$anon$1.timeoutAfter(Timed.scala:127)
   [info]   at 
org.scalatest.concurrent.TimeLimits$.failAfterImpl(TimeLimits.scala:282)
   [info]   at 
org.scalatest.concurrent.TimeLimits.failAfter(TimeLimits.scala:231)
   [info]   at 
org.scalatest.concurrent.TimeLimits.failAfter$(TimeLimits.scala:230)
   [info]   at org.apache.spark.SparkFunSuite.failAfter(SparkFunSuite.scala:69)
   [info]   at 
org.apache.spark.SparkFunSuite.$anonfun$test$2(SparkFunSuite.scala:155)
   [info]   at org.scalatest.OutcomeOf.outcomeOf(OutcomeOf.scala:85)
   [info]   at org.scalatest.OutcomeOf.outcomeOf$(OutcomeOf.scala:83)
   [info]   at org.scalatest.OutcomeOf$.outcomeOf(OutcomeOf.scala:104)
   [info]   at org.scalatest.Transformer.apply(Transformer.scala:22)
   [info]   at org.scalatest.Transformer.apply(Transformer.scala:20)
   [info]   at 
org.scalatest.funsuite.AnyFunSuiteLike$$anon$1.apply(AnyFunSuiteLike.scala:226)
   [info]   at 
org.apache.spark.SparkFunSuite.withFixture(SparkFunSuite.scala:227)
   [info]   at 
org.scalatest.funsuite.AnyFunSuiteLike.invokeWithFixture$1(AnyFunSuiteLike.scala:224)
   [info]   at 
org.scalatest.funsuite.AnyFunSuiteLike.$anonfun$runTest$1(AnyFunSuiteLike.scala:236)
   [info]   at org.scalatest.SuperEngine.runTestImpl(Engine.scala:306)
   [info]   at 
org.scalatest.funsuite.AnyFunSuiteLike.runTest(AnyFunSuiteLike.scala:236)
   [info]   at 
org.scalatest.funsuite.AnyFunSuiteLike.runTest$(AnyFunSuiteLike.scala:218)
   [info]   at 
org.apache.spark.SparkFunSuite.org$scalatest$BeforeAndAfterEach$$super$runTest(SparkFunSuite.scala:69)
   [info]   at 
org.scalatest.BeforeAndAfterEach.runTest(BeforeAndAfterEach.scala:234)
   [info]   at 
org.scalatest.BeforeAndAfterEach.runTest$(BeforeAndAfterEach.scala:227)
   [info]   at 
org.apache.spark.sql.connector.DistributionAndOrderingSuiteBase.org$scalatest$BeforeAndAfter$$super$runTest(DistributionAndOrderingSuiteBase.scala:33)
   [info]   at org.scalatest.BeforeAndAfter.runTest(BeforeAndAfter.scala:213)
   [info]   at org.scalatest.BeforeAndAfter.runTest$(BeforeAndAfter.scala:203)
   [info]   at 
org.apache.spark.sql.connector.DistributionAndOrderingSuiteBase.runTest(DistributionAndOrderingSuiteBase.scala:33)
   [info]   at 
org.scalatest.funsuite.AnyFunSuiteLike.$anonfun$runTests$1(AnyFunSuiteLike.scala:269)
   [info]   at 
org.scalatest.SuperEngine.$anonfun$runTestsInBranch$1(Engine.scala:413)
   [info]   at scala.collection.immutable.List.foreach(List.scala:333)
   [info]   at org.scalatest.SuperEngine.traverseSubNodes$1(Engine.scala:401)
   [info]   at org.scalatest.SuperEngine.runTestsInBranch(Engine.scala:396)
   [info]   at org.scalatest.SuperEngine.runTestsImpl(Engine.scala:475)
   [info]   at 
org.scalatest.funsuite.AnyFunSuiteLike.runTests(AnyFunSuiteLike.scala:269)
   [info]   at 
org.scalatest.funsuite.AnyFunSuiteLike.runTests$(AnyFunSuiteLike.scala:268)
   [info]   at 
org.scalatest.funsuite.AnyFunSuite.runTests(AnyFunSuite.scala:1564)
   [info]   at org.scalatest.Suite.run(Suite.scala:1114)
   [info]   at org.scalatest.Suite.run$(Suite.scala:1096)
   [info]   at 
org.scalatest.funsuite.AnyFunSuite.org$scalatest$funsuite$AnyFunSuiteLike$$super$run(AnyFunSuite.scala:1564)
   [info]   at 
org.scalatest.funsuite.AnyFunSuiteLike.$anonfun$run$1(AnyFunSuiteLike.scala:273)
   [info]   at org.scalatest.SuperEngine.runImpl(Engine.scala:535)
   [info]   at 
org.scalatest.funsuite.AnyFunSuiteLike.run(AnyFunSuiteLike.scala:273)
   [info]   at 
org.scalatest.funsuite.AnyFunSuiteLike.run$(AnyFunSuiteLike.scala:272)
   [info]   at 
org.apache.spark.SparkFunSuite.org$scalatest$BeforeAndAfterAll$$super$run(SparkFunSuite.scala:69)
   [info]   at 
org.scalatest.BeforeAndAfterAll.liftedTree1$1(BeforeAndAfterAll.scala:213)
   [info]   at org.scalatest.BeforeAndAfterAll.run(BeforeAndAfterAll.scala:210)
   [info]   at org.scalatest.BeforeAndAfterAll.run$(BeforeAndAfterAll.scala:208)
   [info]   at 
org.apache.spark.sql.connector.DistributionAndOrderingSuiteBase.org$scalatest$BeforeAndAfter$$super$run(DistributionAndOrderingSuiteBase.scala:33)
   [info]   at org.scalatest.BeforeAndAfter.run(BeforeAndAfter.scala:273)
   [info]   at org.scalatest.BeforeAndAfter.run$(BeforeAndAfter.scala:271)
   [info]   at 
org.apache.spark.sql.connector.DistributionAndOrderingSuiteBase.run(DistributionAndOrderingSuiteBase.scala:33)
   [info]   at 
org.scalatest.tools.Framework.org$scalatest$tools$Framework$$runSuite(Framework.scala:321)
   [info]   at 
org.scalatest.tools.Framework$ScalaTestTask.execute(Framework.scala:517)
   [info]   at sbt.ForkMain$Run.lambda$runTest$1(ForkMain.java:414)
   [info]   at java.util.concurrent.FutureTask.run(FutureTask.java:266)
   [info]   at 
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
   [info]   at 
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
   [info]   at java.lang.Thread.run(Thread.java:750)
   ...
   [info] - SPARK-41471: shuffle one side: work with group partition split *** 
FAILED *** (185 milliseconds)
   [info]   Results do not match for query:
   [info]   Timezone: 
sun.util.calendar.ZoneInfo[id="America/Los_Angeles",offset=-28800000,dstSavings=3600000,useDaylight=true,transitions=185,lastRule=java.util.SimpleTimeZone[id=America/Los_Angeles,offset=-28800000,dstSavings=3600000,useDaylight=true,startYear=0,startMode=3,startMonth=2,startDay=8,startDayOfWeek=1,startTime=7200000,startTimeMode=0,endMode=3,endMonth=10,endDay=1,endDayOfWeek=1,endTime=7200000,endTimeMode=0]]
   [info]   Timezone Env: 
   [info]   
   [info]   == Parsed Logical Plan ==
   [info]   'Sort ['id ASC NULLS FIRST, 'purchase_price ASC NULLS FIRST, 
'sale_price ASC NULLS FIRST], true
   [info]   +- 'Project ['id, 'name, 'i.price AS purchase_price#5710, 'p.price 
AS sale_price#5711]
   [info]      +- 'Join Inner, ('i.id = 'p.item_id)
   [info]         :- 'SubqueryAlias i
   [info]         :  +- 'UnresolvedRelation [testcat, ns, items], [], false
   [info]         +- 'SubqueryAlias p
   [info]            +- 'UnresolvedRelation [testcat, ns, purchases], [], false
   [info]   
   [info]   == Analyzed Logical Plan ==
   [info]   id: bigint, name: string, purchase_price: float, sale_price: float
   [info]   Sort [id#5712L ASC NULLS FIRST, purchase_price#5710 ASC NULLS 
FIRST, sale_price#5711 ASC NULLS FIRST], true
   [info]   +- Project [id#5712L, name#5713, price#5714 AS purchase_price#5710, 
price#5717 AS sale_price#5711]
   [info]      +- Join Inner, (id#5712L = item_id#5716L)
   [info]         :- SubqueryAlias i
   [info]         :  +- SubqueryAlias testcat.ns.items
   [info]         :     +- RelationV2[id#5712L, name#5713, price#5714, 
arrive_time#5715] testcat.ns.items testcat.ns.items
   [info]         +- SubqueryAlias p
   [info]            +- SubqueryAlias testcat.ns.purchases
   [info]               +- RelationV2[item_id#5716L, price#5717, time#5718] 
testcat.ns.purchases testcat.ns.purchases
   [info]   
   [info]   == Optimized Logical Plan ==
   [info]   Sort [id#5712L ASC NULLS FIRST, purchase_price#5710 ASC NULLS 
FIRST, sale_price#5711 ASC NULLS FIRST], true
   [info]   +- Project [id#5712L, name#5713, price#5714 AS purchase_price#5710, 
price#5717 AS sale_price#5711]
   [info]      +- Join Inner, (id#5712L = item_id#5716L)
   [info]         :- Filter isnotnull(id#5712L)
   [info]         :  +- RelationV2[id#5712L, name#5713, price#5714] 
testcat.ns.items
   [info]         +- Filter isnotnull(item_id#5716L)
   [info]            +- RelationV2[item_id#5716L, price#5717] 
testcat.ns.purchases
   [info]   
   [info]   == Physical Plan ==
   [info]   AdaptiveSparkPlan isFinalPlan=true
   [info]   +- == Final Plan ==
   [info]      AQEShuffleRead local
   [info]      +- ShuffleQueryStage 1
   [info]         +- Exchange rangepartitioning(id#5712L ASC NULLS FIRST, 
purchase_price#5710 ASC NULLS FIRST, sale_price#5711 ASC NULLS FIRST, 5), 
ENSURE_REQUIREMENTS, [plan_id=22114]
   [info]            +- *(4) Project [id#5712L, name#5713, price#5714 AS 
purchase_price#5710, price#5717 AS sale_price#5711]
   [info]               +- *(4) SortMergeJoin [id#5712L], [item_id#5716L], Inner
   [info]                  :- *(2) Sort [id#5712L ASC NULLS FIRST], false, 0
   [info]                  :  +- *(2) Project [id#5712L, name#5713, price#5714]
   [info]                  :     +- *(2) Filter isnotnull(id#5712L)
   [info]                  :        +- BatchScan testcat.ns.items[id#5712L, 
name#5713, price#5714] class 
org.apache.spark.sql.connector.catalog.InMemoryBaseTable$InMemoryBatchScan 
RuntimeFilters: []
   [info]                  +- *(3) Sort [item_id#5716L ASC NULLS FIRST], false, 0
   [info]                     +- ShuffleQueryStage 0
   [info]                        +- Exchange 
KeyGroupedPartitioning(Vector(item_id#5716L),3,List([4], [3], [1]),List()), 
ENSURE_REQUIREMENTS, [plan_id=22048]
   [info]                           +- *(1) Project [item_id#5716L, price#5717]
   [info]                              +- *(1) Filter isnotnull(item_id#5716L)
   [info]                                 +- BatchScan 
testcat.ns.purchases[item_id#5716L, price#5717] class 
org.apache.spark.sql.connector.catalog.InMemoryBaseTable$InMemoryBatchScan 
RuntimeFilters: []
   [info]   +- == Initial Plan ==
   [info]      Sort [id#5712L ASC NULLS FIRST, purchase_price#5710 ASC NULLS 
FIRST, sale_price#5711 ASC NULLS FIRST], true, 0
   [info]      +- Exchange rangepartitioning(id#5712L ASC NULLS FIRST, 
purchase_price#5710 ASC NULLS FIRST, sale_price#5711 ASC NULLS FIRST, 5), 
ENSURE_REQUIREMENTS, [plan_id=21858]
   [info]         +- Project [id#5712L, name#5713, price#5714 AS 
purchase_price#5710, price#5717 AS sale_price#5711]
   [info]            +- SortMergeJoin [id#5712L], [item_id#5716L], Inner
   [info]               :- Sort [id#5712L ASC NULLS FIRST], false, 0
   [info]               :  +- Project [id#5712L, name#5713, price#5714]
   [info]               :     +- Filter isnotnull(id#5712L)
   [info]               :        +- BatchScan testcat.ns.items[id#5712L, 
name#5713, price#5714] class 
org.apache.spark.sql.connector.catalog.InMemoryBaseTable$InMemoryBatchScan 
RuntimeFilters: []
   [info]               +- Sort [item_id#5716L ASC NULLS FIRST], false, 0
   [info]                  +- Exchange 
KeyGroupedPartitioning(Vector(item_id#5716L),3,List([4], [3], [1]),List()), 
ENSURE_REQUIREMENTS, [plan_id=21852]
   [info]                     +- Project [item_id#5716L, price#5717]
   [info]                        +- Filter isnotnull(item_id#5716L)
   [info]                           +- BatchScan 
testcat.ns.purchases[item_id#5716L, price#5717] class 
org.apache.spark.sql.connector.catalog.InMemoryBaseTable$InMemoryBatchScan 
RuntimeFilters: []
   [info]   
   [info]   == Results ==
   [info]   
   [info]   == Results ==
   [info]   !== Correct Answer - 2 ==   == Spark Answer - 1 ==
   [info]   !struct<>                   
struct<id:bigint,name:string,purchase_price:float,sale_price:float>
   [info]   ![1,aa,40.0,42.0]           [3,bb,10.0,19.5]
   [info]   ![3,bb,10.0,19.5] (QueryTest.scala:244)
   [info]   org.scalatest.exceptions.TestFailedException:
   [info]   at 
org.scalatest.Assertions.newAssertionFailedException(Assertions.scala:472)
   [info]   at 
org.scalatest.Assertions.newAssertionFailedException$(Assertions.scala:471)
   [info]   at 
org.apache.spark.sql.QueryTest$.newAssertionFailedException(QueryTest.scala:234)
   [info]   at org.scalatest.Assertions.fail(Assertions.scala:933)
   [info]   at org.scalatest.Assertions.fail$(Assertions.scala:929)
   [info]   at org.apache.spark.sql.QueryTest$.fail(QueryTest.scala:234)
   [info]   at org.apache.spark.sql.QueryTest$.checkAnswer(QueryTest.scala:244)
   [info]   at org.apache.spark.sql.QueryTest.checkAnswer(QueryTest.scala:151)
   [info]   at 
org.apache.spark.sql.connector.KeyGroupedPartitioningSuite.$anonfun$new$125(KeyGroupedPartitioningSuite.scala:1219)
   [info]   at 
org.apache.spark.sql.catalyst.plans.SQLHelper.withSQLConf(SQLHelper.scala:54)
   [info]   at 
org.apache.spark.sql.catalyst.plans.SQLHelper.withSQLConf$(SQLHelper.scala:38)
   [info]   at 
org.apache.spark.sql.connector.DistributionAndOrderingSuiteBase.org$apache$spark$sql$test$SQLTestUtilsBase$$super$withSQLConf(DistributionAndOrderingSuiteBase.scala:33)
   [info]   at 
org.apache.spark.sql.test.SQLTestUtilsBase.withSQLConf(SQLTestUtils.scala:247)
   [info]   at 
org.apache.spark.sql.test.SQLTestUtilsBase.withSQLConf$(SQLTestUtils.scala:245)
   [info]   at 
org.apache.spark.sql.connector.DistributionAndOrderingSuiteBase.withSQLConf(DistributionAndOrderingSuiteBase.scala:33)
   [info]   at 
org.apache.spark.sql.connector.KeyGroupedPartitioningSuite.$anonfun$new$124(KeyGroupedPartitioningSuite.scala:1214)
   [info]   at 
org.apache.spark.sql.connector.KeyGroupedPartitioningSuite.$anonfun$new$124$adapted(KeyGroupedPartitioningSuite.scala:1210)
   [info]   at scala.collection.immutable.List.foreach(List.scala:333)
   [info]   at 
org.apache.spark.sql.connector.KeyGroupedPartitioningSuite.$anonfun$new$123(KeyGroupedPartitioningSuite.scala:1210)
   [info]   at 
scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.scala:18)
   [info]   at 
org.scalatest.enablers.Timed$$anon$1.timeoutAfter(Timed.scala:127)
   [info]   at 
org.scalatest.concurrent.TimeLimits$.failAfterImpl(TimeLimits.scala:282)
   [info]   at 
org.scalatest.concurrent.TimeLimits.failAfter(TimeLimits.scala:231)
   [info]   at 
org.scalatest.concurrent.TimeLimits.failAfter$(TimeLimits.scala:230)
   [info]   at org.apache.spark.SparkFunSuite.failAfter(SparkFunSuite.scala:69)
   [info]   at 
org.apache.spark.SparkFunSuite.$anonfun$test$2(SparkFunSuite.scala:155)
   [info]   at org.scalatest.OutcomeOf.outcomeOf(OutcomeOf.scala:85)
   [info]   at org.scalatest.OutcomeOf.outcomeOf$(OutcomeOf.scala:83)
   [info]   at org.scalatest.OutcomeOf$.outcomeOf(OutcomeOf.scala:104)
   [info]   at org.scalatest.Transformer.apply(Transformer.scala:22)
   [info]   at org.scalatest.Transformer.apply(Transformer.scala:20)
   [info]   at 
org.scalatest.funsuite.AnyFunSuiteLike$$anon$1.apply(AnyFunSuiteLike.scala:226)
   [info]   at 
org.apache.spark.SparkFunSuite.withFixture(SparkFunSuite.scala:227)
   [info]   at 
org.scalatest.funsuite.AnyFunSuiteLike.invokeWithFixture$1(AnyFunSuiteLike.scala:224)
   [info]   at 
org.scalatest.funsuite.AnyFunSuiteLike.$anonfun$runTest$1(AnyFunSuiteLike.scala:236)
   [info]   at org.scalatest.SuperEngine.runTestImpl(Engine.scala:306)
   [info]   at 
org.scalatest.funsuite.AnyFunSuiteLike.runTest(AnyFunSuiteLike.scala:236)
   [info]   at 
org.scalatest.funsuite.AnyFunSuiteLike.runTest$(AnyFunSuiteLike.scala:218)
   [info]   at 
org.apache.spark.SparkFunSuite.org$scalatest$BeforeAndAfterEach$$super$runTest(SparkFunSuite.scala:69)
   [info]   at 
org.scalatest.BeforeAndAfterEach.runTest(BeforeAndAfterEach.scala:234)
   [info]   at 
org.scalatest.BeforeAndAfterEach.runTest$(BeforeAndAfterEach.scala:227)
   [info]   at 
org.apache.spark.sql.connector.DistributionAndOrderingSuiteBase.org$scalatest$BeforeAndAfter$$super$runTest(DistributionAndOrderingSuiteBase.scala:33)
   [info]   at org.scalatest.BeforeAndAfter.runTest(BeforeAndAfter.scala:213)
   [info]   at org.scalatest.BeforeAndAfter.runTest$(BeforeAndAfter.scala:203)
   [info]   at 
org.apache.spark.sql.connector.DistributionAndOrderingSuiteBase.runTest(DistributionAndOrderingSuiteBase.scala:33)
   [info]   at 
org.scalatest.funsuite.AnyFunSuiteLike.$anonfun$runTests$1(AnyFunSuiteLike.scala:269)
   [info]   at 
org.scalatest.SuperEngine.$anonfun$runTestsInBranch$1(Engine.scala:413)
   [info]   at scala.collection.immutable.List.foreach(List.scala:333)
   [info]   at org.scalatest.SuperEngine.traverseSubNodes$1(Engine.scala:401)
   [info]   at org.scalatest.SuperEngine.runTestsInBranch(Engine.scala:396)
   [info]   at org.scalatest.SuperEngine.runTestsImpl(Engine.scala:475)
   [info]   at 
org.scalatest.funsuite.AnyFunSuiteLike.runTests(AnyFunSuiteLike.scala:269)
   [info]   at 
org.scalatest.funsuite.AnyFunSuiteLike.runTests$(AnyFunSuiteLike.scala:268)
   [info]   at 
org.scalatest.funsuite.AnyFunSuite.runTests(AnyFunSuite.scala:1564)
   [info]   at org.scalatest.Suite.run(Suite.scala:1114)
   [info]   at org.scalatest.Suite.run$(Suite.scala:1096)
   [info]   at 
org.scalatest.funsuite.AnyFunSuite.org$scalatest$funsuite$AnyFunSuiteLike$$super$run(AnyFunSuite.scala:1564)
   [info]   at 
org.scalatest.funsuite.AnyFunSuiteLike.$anonfun$run$1(AnyFunSuiteLike.scala:273)
   [info]   at org.scalatest.SuperEngine.runImpl(Engine.scala:535)
   [info]   at 
org.scalatest.funsuite.AnyFunSuiteLike.run(AnyFunSuiteLike.scala:273)
   [info]   at 
org.scalatest.funsuite.AnyFunSuiteLike.run$(AnyFunSuiteLike.scala:272)
   [info]   at 
org.apache.spark.SparkFunSuite.org$scalatest$BeforeAndAfterAll$$super$run(SparkFunSuite.scala:69)
   [info]   at 
org.scalatest.BeforeAndAfterAll.liftedTree1$1(BeforeAndAfterAll.scala:213)
   [info]   at org.scalatest.BeforeAndAfterAll.run(BeforeAndAfterAll.scala:210)
   [info]   at org.scalatest.BeforeAndAfterAll.run$(BeforeAndAfterAll.scala:208)
   [info]   at 
org.apache.spark.sql.connector.DistributionAndOrderingSuiteBase.org$scalatest$BeforeAndAfter$$super$run(DistributionAndOrderingSuiteBase.scala:33)
   [info]   at org.scalatest.BeforeAndAfter.run(BeforeAndAfter.scala:273)
   [info]   at org.scalatest.BeforeAndAfter.run$(BeforeAndAfter.scala:271)
   [info]   at 
org.apache.spark.sql.connector.DistributionAndOrderingSuiteBase.run(DistributionAndOrderingSuiteBase.scala:33)
   [info]   at 
org.scalatest.tools.Framework.org$scalatest$tools$Framework$$runSuite(Framework.scala:321)
   [info]   at 
org.scalatest.tools.Framework$ScalaTestTask.execute(Framework.scala:517)
   [info]   at sbt.ForkMain$Run.lambda$runTest$1(ForkMain.java:414)
   [info]   at java.util.concurrent.FutureTask.run(FutureTask.java:266)
   [info]   at 
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
   [info]   at 
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
   [info]   at java.lang.Thread.run(Thread.java:750)
   [info] Run completed in 20 seconds, 935 milliseconds.
   [info] Total number of tests run: 32
   [info] Suites: completed 1, aborted 0
   [info] Tests: succeeded 30, failed 2, canceled 0, ignored 0, pending 0
   [info] *** 2 TESTS FAILED ***
   [error] Failed tests:
   [error]      org.apache.spark.sql.connector.KeyGroupedPartitioningSuite
   ```
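   
   The diff above shows the same partition values coming back in a different order under Scala 2.13 (`[51]` and `[50]` swapped). A hypothetical minimal sketch of why that trips the assertion: Scala `Seq` equality ignores the concrete collection type (so `List` vs `ArraySeq` alone is not the problem) but is order-sensitive:
   
   ```scala
   import scala.collection.immutable.ArraySeq
   
   object SeqEqualityDemo extends App {
     // A List and an ArraySeq holding the same elements in the same
     // order compare equal: Seq equality only checks the elements.
     assert(List(Seq(50), Seq(51), Seq(52)) == ArraySeq(Seq(50), Seq(51), Seq(52)))
   
     // The same elements in a different order do NOT compare equal,
     // matching the [51]/[50] swap in the failure diff above.
     assert(List(Seq(51), Seq(50), Seq(52)) != ArraySeq(Seq(50), Seq(51), Seq(52)))
   
     println("ok")
   }
   ```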
   
   Running `git revert 9e2aafb13739f9c07f8218cd325c5532063b1a51` to revert this PR 
and then executing the same commands as above gives:
   
   ```
   [info] Run completed in 20 seconds, 857 milliseconds.
   [info] Total number of tests run: 32
   [info] Suites: completed 1, aborted 0
   [info] Tests: succeeded 32, failed 0, canceled 0, ignored 0, pending 0
   [info] All tests passed.
   ```
   
   https://github.com/apache/spark/actions/runs/6088706713/job/16519965988
   
   <img width="1681" alt="image" 
src="https://github.com/apache/spark/assets/1475305/842357d9-1683-4f60-9898-e906072b00b5">
   

-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]
