antban commented on code in PR #53016:
URL: https://github.com/apache/spark/pull/53016#discussion_r2537447635
##########
python/pyspark/util.py:
##########
@@ -917,6 +918,67 @@ def default_api_mode() -> str:
return "classic"
+class _FaulthandlerHelper:
+    def __init__(self) -> None:
+        self._log_path: Optional[str] = None
+        self._log_file: Optional[TextIO] = None
+        self._periodic_dump = False
+
+    def start(self) -> None:
+        if self._log_path:
+            raise Exception("Fault handler is already registered. No second registration allowed")
+        self._log_path = os.environ.get("PYTHON_FAULTHANDLER_DIR", None)
+        traceback_dump_interval_seconds = os.environ.get(
+            "PYTHON_TRACEBACK_DUMP_INTERVAL_SECONDS", None
+        )
Review Comment:
Technically yes, that is true; practically, this is a very strange thing to
save on.
I could imagine a test that updates these environment variables to exercise the
functionality, so I would propose to leave it as it is (it does not make things
worse than they were before).
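
To make that concrete, here is a minimal sketch of what such an
environment-variable-driven test could look like. This is an illustration
only, not part of the PR: it assumes `_FaulthandlerHelper` is importable from
`pyspark.util` and that `start()` only needs `PYTHON_FAULTHANDLER_DIR` to
point at an existing directory; the test class name and assertions are
hypothetical.

```python
import os
import tempfile
import unittest
from unittest import mock

from pyspark.util import _FaulthandlerHelper


class FaulthandlerHelperEnvTest(unittest.TestCase):
    def test_start_reads_env_vars(self) -> None:
        with tempfile.TemporaryDirectory() as log_dir:
            env = {
                "PYTHON_FAULTHANDLER_DIR": log_dir,
                "PYTHON_TRACEBACK_DUMP_INTERVAL_SECONDS": "5",
            }
            # patch.dict restores the original environment when the block
            # exits, so the test does not leak state into other tests.
            with mock.patch.dict(os.environ, env):
                helper = _FaulthandlerHelper()
                helper.start()
                # _log_path is populated from PYTHON_FAULTHANDLER_DIR
                # inside start(), so it must be set after a successful call.
                self.assertTrue(helper._log_path)
                # A second start() must be rejected by the guard shown
                # in the diff above.
                with self.assertRaises(Exception):
                    helper.start()


if __name__ == "__main__":
    unittest.main()
```

Because `start()` reads the environment at call time rather than at import
time, a test like this can control the configuration with `mock.patch.dict`
and nothing else.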
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
To unsubscribe, e-mail: [email protected]
For queries about this service, please contact Infrastructure at:
[email protected]