ulysses-you commented on code in PR #39428:
URL: https://github.com/apache/spark/pull/39428#discussion_r1112810290


##########
sql/core/src/test/scala/org/apache/spark/sql/execution/datasources/BasicWriteJobStatsTrackerMetricSuite.scala:
##########
@@ -44,13 +48,14 @@ class BasicWriteJobStatsTrackerMetricSuite extends SparkFunSuite with LocalSpark
       // but the executionId is indeterminate in maven test,
       // so the `statusStore.execution(executionId)` API is not used.
       assert(statusStore.executionsCount() == 2)
-      val executionData = statusStore.executionsList()(1)
-      val accumulatorIdOpt =
-        executionData.metrics.find(_.name == "number of dynamic part").map(_.accumulatorId)
-      assert(accumulatorIdOpt.isDefined)
-      val numPartsOpt = executionData.metricValues.get(accumulatorIdOpt.get)
-      assert(numPartsOpt.isDefined && numPartsOpt.get == partitions)
-
+      eventually(timeout(10.seconds)) {

Review Comment:
   I think it is related, given that the test failed in CI but passed in my local run.
   
   I think the reason is that the metrics flow changed slightly: before, all write file metrics came from the `SparkListenerDriverAccumUpdates` event, while now the metrics can be updated by both task events and `SparkListenerDriverAccumUpdates`, so the final values may arrive asynchronously.
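   
   For context, here is a minimal sketch (not the exact patch) of how the assertion can be wrapped in ScalaTest's `eventually` so the check retries until the asynchronously delivered metric values settle. `statusStore`, `partitions`, and the metric name come from the diff above; the imports are the usual ScalaTest ones used in Spark tests:
   
   ```scala
   import org.scalatest.concurrent.Eventually._
   import org.scalatest.time.SpanSugar._
   
   // Retry the whole block (for up to 10 seconds) until both the task events
   // and SparkListenerDriverAccumUpdates have been folded into the metric values.
   eventually(timeout(10.seconds)) {
     val executionData = statusStore.executionsList()(1)
     val accumulatorIdOpt =
       executionData.metrics.find(_.name == "number of dynamic part").map(_.accumulatorId)
     assert(accumulatorIdOpt.isDefined)
     val numPartsOpt = executionData.metricValues.get(accumulatorIdOpt.get)
     assert(numPartsOpt.isDefined && numPartsOpt.get == partitions)
   }
   ```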


