GitHub user victor-wong commented on the issue:
https://github.com/apache/spark/pull/19824
@CodingCat Yes, this PR addresses the same issue as
https://github.com/apache/spark/pull/16542, but I think this is a better way to
solve it.
If a Job fails, I think we should not remove it from its JobSet, so that
`jobSet.hasCompleted` returns false. As a result, no
StreamingListenerBatchCompleted event is posted.
In other words, if a Job fails, the Batch should be considered not
completed.
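
To illustrate the idea, here is a minimal, self-contained sketch. These are not Spark's actual classes (`SimpleJobSet`, `Job`, and the result types here are hypothetical, simplified stand-ins); it only shows the proposed behavior that a failed Job stays in the incomplete set, so `hasCompleted` remains false and the scheduler would have no reason to post a batch-completed event:

```scala
import scala.collection.mutable

// Hypothetical, simplified stand-ins for illustration only.
sealed trait JobResult
case object JobSucceeded extends JobResult
case class JobFailed(error: Throwable) extends JobResult

case class Job(id: Int)

class SimpleJobSet(jobs: Seq[Job]) {
  private val incompleteJobs = mutable.HashSet[Job](jobs: _*)

  // Only a successful Job is removed; a failed Job is kept,
  // so the set stays non-empty and the batch is not "completed".
  def handleJobCompletion(job: Job, result: JobResult): Unit = result match {
    case JobSucceeded => incompleteJobs -= job
    case JobFailed(_) => ()
  }

  def hasCompleted: Boolean = incompleteJobs.isEmpty
}

object Demo extends App {
  val jobSet = new SimpleJobSet(Seq(Job(1), Job(2)))

  jobSet.handleJobCompletion(Job(1), JobSucceeded)
  jobSet.handleJobCompletion(Job(2), JobFailed(new RuntimeException("boom")))

  // Prints false: with the failed Job retained, a scheduler checking
  // hasCompleted would not post StreamingListenerBatchCompleted.
  println(jobSet.hasCompleted)
}
```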
I am not confident in my English, so if I am not describing this clearly,
please let me know.