Github user srowen commented on a diff in the pull request:
https://github.com/apache/spark/pull/22004#discussion_r207752883
--- Diff: core/src/test/scala/org/apache/spark/scheduler/DAGSchedulerSuite.scala ---
@@ -2369,39 +2369,12 @@ class DAGSchedulerSuite extends SparkFunSuite with LocalSparkContext with TimeLi
     assert(scheduler.getShuffleDependencies(rddE) === Set(shuffleDepA, shuffleDepC))
   }
-  test("SPARK-17644: After one stage is aborted for too many failed attempts, subsequent stages" +
+  test("SPARK-17644: After one stage is aborted for too many failed attempts, subsequent stages " +
     "still behave correctly on fetch failures") {
-    // Runs a job that always encounters a fetch failure, so should eventually be aborted
--- End diff ---
Just that the task (the argument to the rdd.map call) isn't serializable, for the same reason the LegacyAccumulatorWrapper case failed: the closure captures the enclosing test class, which has an unserializable AssertionsHelper field from a scalatest superclass. The real problem is that the closure captures the enclosing test class at all, since nothing in it is relevant to the task.
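
For illustration only (this is not code from the PR), here's a minimal sketch of that capture problem and the usual fix. `UnserializableHelper`, `MySuite`, and `offset` are made-up names standing in for scalatest's AssertionsHelper and the test class:

```scala
import org.apache.spark.{SparkConf, SparkContext}

// Stand-in for scalatest's AssertionsHelper: a field type that is not
// Serializable.
class UnserializableHelper

// Stand-in for the test class. It is Serializable itself, so the failure
// below comes specifically from the unserializable field, as in the PR.
class MySuite extends Serializable {
  val helper = new UnserializableHelper  // the unserializable field
  val offset = 1

  def runJob(sc: SparkContext): Unit = {
    val rdd = sc.parallelize(1 to 10)

    // Fails: `offset` is a field, so the lambda compiles to `this.offset`
    // and captures the whole MySuite instance, including `helper`. Task
    // serialization then throws NotSerializableException.
    // rdd.map(x => x + offset).collect()

    // Works: copy the value into a local first; the closure now captures
    // only an Int, not the enclosing instance.
    val localOffset = offset
    rdd.map(x => x + localOffset).collect()
  }
}

object Demo {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(
      new SparkConf().setMaster("local[2]").setAppName("capture-demo"))
    try new MySuite().runJob(sc) finally sc.stop()
  }
}
```

Either keeping closures free of references to instance members, or hoisting the needed values into locals, avoids dragging the test class into the serialized task, which is the point above.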
---
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]