dillitz opened a new pull request, #41829: URL: https://github.com/apache/spark/pull/41829
### What changes were proposed in this pull request?
This PR introduces a configurable retry mechanism for the Scala `SparkConnectClient`. The exponential-backoff parameters and a filter selecting which exceptions to retry are passed to the client via the existing (extended) `Configuration` class. By default, no exception triggers a retry, so this change does not alter current behavior. The retry logic could later be moved into the GRPC stub that may be introduced [here](https://github.com/apache/spark/pull/41743).

### Why are the changes needed?
Several existing exceptions are reasonable candidates for a retry. For example, retrying would allow a command executed while the cluster is still starting to succeed instead of failing with an exception.

### Does this PR introduce _any_ user-facing change?
No.

### How was this patch tested?
Tests included.
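To illustrate the mechanism described above, here is a minimal, self-contained sketch of an exponential-backoff retry helper. All names (`RetryPolicy`, `retryWithBackoff`, the parameter names and defaults) are hypothetical illustrations, not the actual identifiers or values used in this PR; the key properties mirrored from the description are that backoff parameters and the retryable-exception filter come from a configuration object, and that the default filter retries nothing.

```scala
import scala.util.control.NonFatal

// Hypothetical stand-in for the backoff settings the PR adds to Configuration.
// Parameter names and default values are illustrative only.
case class RetryPolicy(
    maxRetries: Int = 4,
    initialBackoffMs: Long = 50,
    maxBackoffMs: Long = 30000,
    backoffMultiplier: Double = 4.0,
    // By default no exception is retried, matching the PR's default behavior.
    canRetry: Throwable => Boolean = _ => false)

object RetryUtil {
  // Runs `fn`, retrying with exponential backoff while the policy's
  // filter accepts the thrown exception and attempts remain.
  def retryWithBackoff[T](policy: RetryPolicy)(fn: => T): T = {
    var attempt = 0
    while (true) {
      try {
        return fn
      } catch {
        case NonFatal(e) if attempt < policy.maxRetries && policy.canRetry(e) =>
          val backoff = math.min(
            policy.initialBackoffMs * math.pow(policy.backoffMultiplier, attempt),
            policy.maxBackoffMs.toDouble).toLong
          Thread.sleep(backoff)
          attempt += 1
      }
    }
    throw new IllegalStateException("unreachable")
  }
}

// Usage: retry only RuntimeExceptions, e.g. a request sent while the
// cluster is still starting; the third attempt succeeds.
val policy = RetryPolicy(canRetry = _.isInstanceOf[RuntimeException])
var calls = 0
val result = RetryUtil.retryWithBackoff(policy) {
  calls += 1
  if (calls < 3) throw new RuntimeException("cluster starting")
  "ok"
}
```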
