[
https://issues.apache.org/jira/browse/SPARK-39967?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
jingxiong zhong resolved SPARK-39967.
-------------------------------------
Resolution: Fixed
Not reproduced in newer versions
> Use the `successful` array instead of the scalar `tasksSuccessful` to
> determine whether tasks are completed
> ----------------------------------------------------------------------------------------------------------------
>
> Key: SPARK-39967
> URL: https://issues.apache.org/jira/browse/SPARK-39967
> Project: Spark
> Issue Type: Improvement
> Components: Spark Core
> Affects Versions: 2.4.3, 2.4.6
> Reporter: jingxiong zhong
> Priority: Critical
> Attachments: spark1-1.png, spark2.png, spark3-1.png
>
>
> When counting the number of successful tasks in a Spark stage, Spark uses the
> scalar counter `tasksSuccessful`, but the actual success or failure of each
> task is recorded in the `successful` array. Logging I added shows that the
> count held by `tasksSuccessful` can become inconsistent with the state stored
> in the `successful` array. We should treat `successful` as the source of
> truth.
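The inconsistency described above can be illustrated with a minimal, self-contained sketch. It is not Spark's actual `TaskSetManager` code; the field names `successful` and `tasksSuccessful` are modeled on Spark Core's internals, and the duplicate-success event is a hypothetical way the scalar counter could drift (for example, if a success handler runs twice for the same task index). Deriving the count from the boolean array keeps it consistent with per-task state:

```scala
// Sketch of the counting pattern at issue, assuming a simplified task set.
object TaskCountSketch {
  final class TaskSet(numTasks: Int) {
    // Per-task success flags, modeled on TaskSetManager's `successful` array.
    val successful = new Array[Boolean](numTasks)
    // Scalar counter, modeled on `tasksSuccessful`; can drift if an event
    // is processed twice for the same task index.
    var tasksSuccessful = 0

    def handleSuccessfulTask(index: Int): Unit = {
      // Drift-prone pattern: increment unconditionally, then mark the array.
      tasksSuccessful += 1
      successful(index) = true
    }

    // The fix the issue proposes: derive completion from the array itself.
    def successfulFromArray: Int = successful.count(identity)
    def isAllTasksCompleted: Boolean = successful.forall(identity)
  }

  def main(args: Array[String]): Unit = {
    val ts = new TaskSet(3)
    ts.handleSuccessfulTask(0)
    ts.handleSuccessfulTask(1)
    ts.handleSuccessfulTask(1) // hypothetical duplicate success event
    ts.handleSuccessfulTask(2)
    println(ts.tasksSuccessful)     // 4: the scalar has over-counted
    println(ts.successfulFromArray) // 3: the array reflects reality
  }
}
```

Counting from the array is idempotent under duplicate events, which is why the issue argues it should be the standard for deciding whether all tasks in the stage have completed.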
--
This message was sent by Atlassian Jira
(v8.20.10#820010)