WamBamBoozle commented on pull request #31072: URL: https://github.com/apache/spark/pull/31072#issuecomment-756261437
> can we have something like spark.python.daemon.module and spark.python.worker.module to allow custom worker and daemon instead?

No. Those configurations would put a lot of responsibility on the user, and they would be very prone to breakage as Spark changes. What this PR proposes is a narrow change: initializing the daemon. It doesn't change the logic of either the daemon or the worker.
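For context, here is a minimal sketch of what the quoted suggestion would ask of users if those configurations were exposed. The config names come from the question above; how (or whether) Spark honors them is internal and may change between releases, which is part of the burden being argued against. The module names are placeholders.

```python
# Sketch only (assumption): setting the configs named in the quoted question.
# A user going this route would have to supply modules that reimplement the
# daemon/worker protocol themselves and keep them in sync with Spark internals.
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .appName("custom-daemon-sketch")
    # Hypothetical usage; "mycompany.custom_daemon" / "mycompany.custom_worker"
    # are placeholder module names, not real modules.
    .config("spark.python.daemon.module", "mycompany.custom_daemon")
    .config("spark.python.worker.module", "mycompany.custom_worker")
    .getOrCreate()
)
```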
