juliuszsompolski commented on code in PR #42399:
URL: https://github.com/apache/spark/pull/42399#discussion_r1288917356


##########
connector/connect/client/jvm/src/main/scala/org/apache/spark/sql/connect/client/ExecutePlanResponseReattachableIterator.scala:
##########
@@ -297,6 +297,6 @@ class ExecutePlanResponseReattachableIterator(
   /**
   * Retries the given function with exponential backoff according to the client's retryPolicy.
    */
-  private def retry[T](fn: => T, currentRetryNum: Int = 0): T =
-    GrpcRetryHandler.retry(retryPolicy)(fn, currentRetryNum)
+  private def retry[T](fn: => T): T =
+    GrpcRetryHandler.retry(retryPolicy)(fn)

Review Comment:
   Discussed offline:
   ```
         override def onError(t: Throwable): Unit = {
           var firstTry = true
           try {
             retry {
               if (firstTry) {
                 firstTry = false
                 throw t // we already failed once, handle first retry
               } else {
                 rawBlockingStub.releaseExecute(requestForRetry)
               }
             }
           } catch {
             case NonFatal(e) =>
               logWarning(s"ReleaseExecute failed with exception: $e.")
          }
      }
   ```
   should work, and it is an improvement because it reuses the shared retry logic. After the first failure we are already on the gRPC async execution thread, so we can iterate there with a retry loop instead of recursing. This also brings the code closer to the Python implementation.
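   To illustrate the pattern outside of Spark, here is a minimal, self-contained sketch. The `retry` helper, its parameters, and `onErrorLike` are hypothetical stand-ins for illustration, not the actual `GrpcRetryHandler` API; the point is that the first "attempt" rethrows the original error so the shared iterative retry loop counts it, and subsequent attempts do the real work.

   ```scala
   import scala.util.control.NonFatal

   object RetrySketch {
     // Hypothetical stand-in for GrpcRetryHandler.retry: an iterative retry
     // loop with exponential backoff (no recursion).
     def retry[T](maxRetries: Int = 3, initialBackoffMs: Long = 10)(fn: => T): T = {
       var attempt = 0
       var backoff = initialBackoffMs
       while (true) {
         try {
           return fn
         } catch {
           case NonFatal(_) if attempt < maxRetries =>
             attempt += 1
             Thread.sleep(backoff)
             backoff *= 2
         }
       }
       throw new IllegalStateException("unreachable")
     }

     // Mirrors the onError pattern from the review: the first pass rethrows
     // the original error so the retry loop handles it as attempt one, then
     // the actual work (e.g. releasing the execution) runs on later attempts.
     def onErrorLike(t: Throwable, work: () => String): String = {
       var firstTry = true
       retry() {
         if (firstTry) {
           firstTry = false
           throw t // we already failed once; let the loop count this attempt
         } else {
           work()
         }
       }
     }

     def main(args: Array[String]): Unit = {
       println(onErrorLike(new RuntimeException("boom"), () => "released"))
     }
   }
   ```

   Because the loop runs on the calling thread, no new stack frames accumulate per attempt, which is the advantage over the recursive variant being removed in the diff above.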



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]

