GitHub user jose-torres opened a pull request:
https://github.com/apache/spark/pull/20225
[SPARK-23033] Don't use task level retry for continuous processing
## What changes were proposed in this pull request?
Continuous processing tasks now fail on any attempt number greater than 0.
ContinuousExecution catches these failures and restarts the query globally
from the last recorded checkpoint.
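The failure-conversion pattern described above can be sketched outside Spark. This is an illustrative model only, not the PR's Scala code: the names `run_task`, `run_epoch`, and `ContinuousTaskRetryException` are hypothetical stand-ins for the idea that a continuous task refuses per-task retry and the driver turns that into a global restart.

```python
# Illustrative sketch (not Spark code): a continuous task must only run as
# attempt 0; any retried attempt fails fast so the driver can restart the
# whole query from the last checkpoint instead.

class ContinuousTaskRetryException(Exception):
    """Raised when a continuous task is launched with attempt number > 0."""

def run_task(partition, attempt_number):
    # A task-level retry would resume from a stale per-task offset, so a
    # continuous task rejects any attempt other than the first.
    if attempt_number != 0:
        raise ContinuousTaskRetryException(
            f"continuous task for partition {partition} retried "
            f"(attempt {attempt_number})")
    return f"processed partition {partition}"

def run_epoch(partitions):
    # Driver side: if any task raises, the exception propagates up and the
    # query is restarted globally from the checkpoint (restart loop elided).
    return [run_task(p, attempt_number=0) for p in partitions]
```

The point of the pattern is that correctness no longer depends on per-task retry semantics: every recovery path goes through the single checkpoint-based restart.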
## How was this patch tested?
Unit tests.
You can merge this pull request into a Git repository by running:
$ git pull https://github.com/jose-torres/spark no-retry
Alternatively you can review and apply these changes as the patch at:
https://github.com/apache/spark/pull/20225.patch
To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:
This closes #20225
----
commit 9f7066e4c9f83476012d896b39844d9ee395bd3e
Author: Jose Torres <jose@...>
Date: 2018-01-04T23:03:45Z
fail if attempt number isn't 0
commit f641be015be4848747ae88265389a5be2e763d00
Author: Jose Torres <jose@...>
Date: 2018-01-05T00:54:07Z
capture task retry and convert it to global retry
commit 761fd26b93f404716887ec3d75d084f9db608c91
Author: Jose Torres <jose@...>
Date: 2018-01-10T23:57:25Z
bring in state update
commit 1bf613f2162ad07289d99c7ef3cbd0a7e2b73558
Author: Jose Torres <jose@...>
Date: 2018-01-10T23:59:51Z
bring in stream test sync
----