Github user advancedxy commented on a diff in the pull request:
https://github.com/apache/spark/pull/21165#discussion_r186468071
--- Diff: core/src/main/scala/org/apache/spark/TaskEndReason.scala ---
@@ -212,9 +212,15 @@ case object TaskResultLost extends TaskFailedReason {
* Task was killed intentionally and needs to be rescheduled.
*/
@DeveloperApi
-case class TaskKilled(reason: String) extends TaskFailedReason {
+case class TaskKilled(
+ reason: String,
+ accumUpdates: Seq[AccumulableInfo] = Seq.empty,
+ private[spark] val accums: Seq[AccumulatorV2[_, _]] = Nil)
--- End diff --
Hi @cloud-fan, I looked into how to remove `Seq[AccumulableInfo]`
tonight.
It turns out that we cannot, because `JsonProtocol` calls
`taskEndReasonFromJson` to reconstruct `TaskEndReason`s. Since `AccumulatorV2`
is an abstract class, we cannot simply construct `AccumulatorV2`s from JSON.
Even if we are promoting `AccumulatorV2`, we still need `AccumulableInfo`
when (de)serializing to/from JSON.
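
To illustrate the constraint (a hedged sketch, not actual Spark code; `accumulatorFromJson` is a hypothetical helper that does not exist in `JsonProtocol`): deserialization would have to pick a concrete subclass of the abstract `AccumulatorV2`, which the JSON alone cannot determine, whereas a plain case class like `AccumulableInfo` round-trips trivially.

```scala
// Sketch only -- simplified stand-ins for the Spark types.
// A plain case class round-trips through JSON: every field is data.
case class AccumulableInfo(id: Long, name: Option[String], update: Option[Any])

// AccumulatorV2 is abstract; concrete subclasses (LongAccumulator,
// CollectionAccumulator, arbitrary user-defined ones) carry behavior
// that is not present in the serialized JSON.
abstract class AccumulatorV2[IN, OUT] { def value: OUT }

// Hypothetical deserializer: there is no generic, safe way to decide
// which AccumulatorV2 subclass to instantiate from the JSON fields,
// so this cannot be implemented in general.
def accumulatorFromJson(json: Map[String, Any]): AccumulatorV2[_, _] =
  throw new UnsupportedOperationException(
    "cannot reconstruct an abstract AccumulatorV2 from JSON alone")
```

This is why the event-log path keeps `AccumulableInfo` even while `AccumulatorV2` is the preferred in-memory representation.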
---
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]