juliuszsompolski commented on code in PR #43800:
URL: https://github.com/apache/spark/pull/43800#discussion_r1392623033


##########
python/pyspark/sql/connect/client/retries.py:
##########
@@ -248,8 +248,29 @@ class RetryException(Exception):
 
 
 class DefaultPolicy(RetryPolicy):
-    def __init__(self, **kwargs):  # type: ignore[no-untyped-def]
-        super().__init__(**kwargs)
+    # Please synchronize changes here with Scala side
+    # GrpcRetryHandler.scala
+    #
+    # Note: the number of retries is selected so that the maximum tolerated wait
+    # is guaranteed to be at least 10 minutes
+    
+    def __init__(
+        self,
+        max_retries: Optional[int] = 15,
+        backoff_multiplier: float = 4.0,
+        initial_backoff: int = 50,
+        max_backoff: Optional[int] = 60000,
+        jitter: int = 500,
+        min_jitter_threshold: int = 2000,
+    ):
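
As a quick sanity check of the "at least 10 minutes" claim in the comment above, here is a minimal sketch. It assumes the policy applies the usual capped exponential backoff, wait_i = min(initial_backoff * backoff_multiplier**i, max_backoff), and ignores jitter, which only adds to the wait; the formula is an assumption for illustration, not quoted from the PR.

```python
# Sanity check of the cumulative wait implied by DefaultPolicy's defaults.
# Assumes capped exponential backoff; jitter is ignored (it only increases the wait).
initial_backoff = 50        # ms
backoff_multiplier = 4.0
max_backoff = 60_000        # ms
max_retries = 15

waits = [
    min(initial_backoff * backoff_multiplier ** i, max_backoff)
    for i in range(max_retries)
]
print(sum(waits) / 60_000)  # ~10.14 minutes, so the >= 10 minute claim holds
```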

Review Comment:
   Should it be just `def __init__(self)`, passing these values as arguments to the superclass?
   In the description you mention that the suggested way to use RetryPolicies is to make a subclass.
   So I think someone should rather create a new subclass of RetryPolicy than instantiate DefaultPolicy with different parameters?
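
   For example, a minimal sketch of the suggested pattern (`CustomPolicy` and its parameter values are hypothetical, not part of the PR; it assumes `RetryPolicy.__init__` accepts these keyword arguments, as the removed `super().__init__(**kwargs)` call suggests):

   ```python
   from pyspark.sql.connect.client.retries import RetryPolicy


   # Hypothetical subclass that fixes its own parameters, rather than
   # constructing DefaultPolicy with non-default arguments.
   class CustomPolicy(RetryPolicy):
       def __init__(self) -> None:
           super().__init__(
               max_retries=30,          # illustrative values only
               backoff_multiplier=2.0,
               initial_backoff=100,
               max_backoff=30_000,
               jitter=500,
               min_jitter_threshold=2000,
           )
   ```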



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.


