[ https://issues.apache.org/jira/browse/SPARK-38293?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17496422#comment-17496422 ]

angerszhu commented on SPARK-38293:
-----------------------------------

cc [~dongjoon] [~hyukjin.kwon] I have hit this a few times now.
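
For context, the failure at HealthTrackerIntegrationSuite.scala:94 (log below) shows the collected results coming back as an empty Map instead of all ten partitions mapping to 42. A minimal sketch of the shape of that check, with illustrative names only (`results` stands in for whatever the suite actually collects; this is not a confirmed reading of the test code):

{code:scala}
object FlakyAssertionSketch extends App {
  // Expected per the log: every partition index (0 to 9) maps to 42.
  val expected: Map[Int, Int] = (0 until 10).map(p => p -> 42).toMap

  // What the flaky run observed: no task results recorded at all.
  val results: Map[Int, Int] = Map.empty

  // Reproduces the failure message shape from the log:
  // "Map() did not equal Map(0 -> 42, 5 -> 42, ...)"
  assert(results == expected, s"$results did not equal $expected")
}
{code}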

> Fix flaky test of HealthTrackerIntegrationSuite
> ------------------------------------------------
>
>                 Key: SPARK-38293
>                 URL: https://issues.apache.org/jira/browse/SPARK-38293
>             Project: Spark
>          Issue Type: Task
>          Components: Spark Core
>    Affects Versions: 3.2.1
>            Reporter: angerszhu
>            Priority: Major
>
> {code:java}
> [info] HealthTrackerIntegrationSuite:
> [info] - If preferred node is bad, without excludeOnFailure job will fail (120 milliseconds)
> [info] - With default settings, job can succeed despite multiple bad executors on node (3 seconds, 78 milliseconds)
> [info] - Bad node with multiple executors, job will still succeed with the right confs *** FAILED *** (61 milliseconds)
> [info]   Map() did not equal Map(0 -> 42, 5 -> 42, 1 -> 42, 6 -> 42, 9 -> 42, 2 -> 42, 7 -> 42, 3 -> 42, 8 -> 42, 4 -> 42) (HealthTrackerIntegrationSuite.scala:94)
> [info]   Analysis:
> [info]   HashMap(0: -> 42, 1: -> 42, 2: -> 42, 3: -> 42, 4: -> 42, 5: -> 42, 6: -> 42, 7: -> 42, 8: -> 42, 9: -> 42)
> [info]   org.scalatest.exceptions.TestFailedException:
> [info]   at org.scalatest.Assertions.newAssertionFailedException(Assertions.scala:472)
> [info]   at org.scalatest.Assertions.newAssertionFailedException$(Assertions.scala:471)
> [info]   at org.scalatest.Assertions$.newAssertionFailedException(Assertions.scala:1231)
> [info]   at org.scalatest.Assertions$AssertionsHelper.macroAssert(Assertions.scala:1295)
> {code}


