allisonwang-db commented on code in PR #43682:
URL: https://github.com/apache/spark/pull/43682#discussion_r1384244315


##########
python/pyspark/worker.py:
##########
@@ -1057,6 +1059,9 @@ def mapper(_, it):
                     yield from eval(*[a[o] for o in args_kwargs_offsets])
                 if terminate is not None:
                     yield from terminate()
+            except StopIteration:
+                if terminate is not None:
+                    yield from terminate()

Review Comment:
   If we catch the `StopIteration` exception in `mapper`, do we still need the `try ... except` blocks inside `def func` and `def evaluate` below?
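
   A minimal standalone sketch (hypothetical names, not the actual `pyspark/worker.py` code) of the control flow the diff introduces: once the outer mapper generator catches `StopIteration`, a user-raised stop still flushes results via `terminate()`. Here `eval_fn` is a plain function (not a generator), so the `StopIteration` it raises propagates normally into the mapper's body, where it can be caught before PEP 479 would convert an escaping `StopIteration` into a `RuntimeError`.

   ```python
   def make_mapper(eval_fn, terminate_fn):
       def mapper(rows):
           try:
               for row in rows:
                   yield from eval_fn(row)
               if terminate_fn is not None:
                   yield from terminate_fn()
           except StopIteration:
               # A user-raised StopIteration ends evaluation early;
               # terminate() still runs so buffered results are flushed.
               if terminate_fn is not None:
                   yield from terminate_fn()
       return mapper

   # Hypothetical UDTF-like pair: stop after seeing 2, then flush a running total.
   state = {"total": 0}

   def eval_fn(x):
       if x == 2:
           raise StopIteration  # user signals "stop consuming input"
       state["total"] += x
       return [x]

   def terminate_fn():
       yield state["total"]

   out = list(make_mapper(eval_fn, terminate_fn)([1, 2, 3]))
   # out == [1, 1]: row 1 is emitted, row 2 triggers the stop, terminate flushes 1
   ```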



##########
python/pyspark/worker.py:
##########
@@ -995,6 +995,8 @@ def verify_result(result):
             def func(*a: Any) -> Any:
                 try:
                     return f(*a)
+                except StopIteration:
+                    raise

Review Comment:
   Makes sense!
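
   A short sketch of why the bare `except StopIteration: raise` is useful (hypothetical wrapper and error type, not pyspark internals): placed before a broad handler, it lets `StopIteration` pass through untouched so the outer mapper can treat it as a termination signal, while any other user-code exception still gets wrapped.

   ```python
   class UdfError(Exception):
       """Stand-in for a wrapped user-function error type."""

   def wrap(f):
       def func(*a):
           try:
               return f(*a)
           except StopIteration:
               raise  # pass through: the outer mapper handles termination
           except Exception as e:
               raise UdfError(f"user function failed: {e}") from e
       return func

   def stops(_):
       raise StopIteration

   def breaks(_):
       raise ValueError("boom")

   passthrough = wrapped = False
   try:
       wrap(stops)(None)
   except StopIteration:
       passthrough = True  # StopIteration escaped unwrapped

   try:
       wrap(breaks)(None)
   except UdfError:
       wrapped = True  # other errors are converted to the wrapper type
   ```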



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]

