dillitz commented on code in PR #41829:
URL: https://github.com/apache/spark/pull/41829#discussion_r1251932196
##########
connector/connect/client/jvm/src/main/scala/org/apache/spark/sql/connect/client/SparkConnectClient.scala:
##########
@@ -60,14 +64,36 @@ private[sql] class SparkConnectClient(
new ArtifactManager(userContext, sessionId, channel)
}
+  private val retryParameters: SparkConnectClient.RetryParameters =
+    configuration.retryParameters
+
+  @tailrec private[client] final def retry[T](fn: => T, retries: Int = 0): T = {
+    if (retries > retryParameters.max_retries) {
+      throw new IllegalArgumentException(
+        s"retries must not exceed ${retryParameters.max_retries}")
+    }
+    Try {
Review Comment:
   I did it this way to be able to use the `@tailrec` optimization, but I have now found a way to do it with a plain old try/catch.
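   For context, a minimal sketch of what such a non-recursive, plain try/catch retry loop could look like. This is not the code from this PR; the names `RetrySketch`, `maxRetries`, and `retryDelayMs` are illustrative assumptions.

   ```scala
   import scala.util.control.NonFatal

   object RetrySketch {
     // Retries `fn` up to `maxRetries` times on non-fatal errors, sleeping
     // `retryDelayMs` between attempts. A sketch, not the PR's actual API.
     def retry[T](maxRetries: Int = 3, retryDelayMs: Long = 100)(fn: => T): T = {
       var currentRetryNum = 0
       while (true) {
         try {
           return fn // success: hand the result back immediately
         } catch {
           case NonFatal(_) if currentRetryNum < maxRetries =>
             // Real retry logic would also check that the error is actually
             // retryable, e.g. a transient gRPC status.
             currentRetryNum += 1
             Thread.sleep(retryDelayMs)
         }
       }
       throw new IllegalStateException("unreachable: the loop above never exits normally")
     }
   }

   // Example usage: retries the block up to three times before rethrowing.
   val answer = RetrySketch.retry() {
     42 // stand-in for e.g. a flaky RPC call
   }
   ```

   The loop form avoids the need for `@tailrec` entirely, since there is no recursive call to optimize; the exception simply propagates once the retry budget is exhausted.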