Github user carsonwang commented on the issue:
https://github.com/apache/spark/pull/22907
What if there is a FetchFailure and Spark reruns some tasks in a previously
succeeded shuffle map stage? That will be a new ShuffleMapStage, and we will
still double-count the accumulators, right?
