wangshengjie123 commented on a change in pull request #34834:
URL: https://github.com/apache/spark/pull/34834#discussion_r779375709
##########
File path: docs/configuration.md
##########
@@ -2461,9 +2461,10 @@ Apart from these, the following properties are also
available, and may be useful
<td><code>spark.task.maxFailures</code></td>
<td>4</td>
<td>
- Number of failures of any particular task before giving up on the job.
+ Number of continuous failures of any particular task before giving up on the job.
  The total number of failures spread across different tasks will not cause the job
- to fail; a particular task has to fail this number of attempts.
+ to fail; a particular task has to fail this number of attempts continuously.
+ Be aware that if one of the attempts succeeds, the current count of task failures will be reset.
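The semantics described above (consecutive failures abort the job, a success resets the count) can be sketched roughly as follows. This is an illustration only, not Spark's actual scheduler code; the function name and the outcome representation are made up for the example, and the default of 4 matches `spark.task.maxFailures`.

```python
MAX_FAILURES = 4  # default value of spark.task.maxFailures

def task_eventually_succeeds(attempt_outcomes, max_failures=MAX_FAILURES):
    """Illustrative model: `attempt_outcomes` is a list of booleans,
    one per attempt of a single task (True = attempt succeeded).
    Returns True if the task succeeds, False if the job would be
    aborted after `max_failures` consecutive failed attempts."""
    failures = 0
    for succeeded in attempt_outcomes:
        if succeeded:
            # A successful attempt resets the running failure count.
            failures = 0
            return True
        failures += 1
        if failures >= max_failures:
            # This many consecutive failures gives up on the job.
            return False
    return False
```

For example, three failed attempts followed by a success completes the task, while four consecutive failures aborts the job.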
Review comment:
Got it, thanks.
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
To unsubscribe, e-mail: [email protected]
For queries about this service, please contact Infrastructure at:
[email protected]