HyukjinKwon commented on issue #21977: [SPARK-25004][CORE] Add spark.executor.pyspark.memory limit. URL: https://github.com/apache/spark/pull/21977#issuecomment-455854590 BTW, I don't think many people use `spark.python.worker.memory`, since RDD APIs are arguably used less these days, and apparently all reviewers (including me) missed this configuration. I think we can simply remove it and replace it with `spark.executor.pyspark.memory`, along with a migration guide note. If you agree with this approach, I'll make a follow-up PR; I actually have some of the work done already from this investigation.
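For illustration, the proposed migration might look like the following at job-submission time. This is a hypothetical sketch (the job script name and the `2g` value are made-up placeholders, not from the PR):

```shell
# Legacy setting, used only by RDD-based aggregation in PySpark
# (proposed for removal in this discussion):
#   --conf spark.python.worker.memory=512m

# Replacement per-executor limit introduced by SPARK-25004
# (my_job.py and the 2g value are hypothetical examples):
spark-submit \
  --conf spark.executor.pyspark.memory=2g \
  my_job.py
```

Unlike the legacy option, which only tunes the spill threshold for RDD aggregation, the new option caps the memory of the Python worker processes themselves.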