Github user BryanCutler commented on a diff in the pull request:
https://github.com/apache/spark/pull/23055#discussion_r235523238
--- Diff: docs/configuration.md ---
@@ -189,7 +189,7 @@ of the most common options to set are:
limited to this amount. If not set, Spark will not limit Python's memory use
and it is up to the application to avoid exceeding the overhead memory space
shared with other non-JVM processes. When PySpark is run in YARN or Kubernetes, this memory
- is added to executor resource requests.
+ is added to executor resource requests. This configuration is not supported on Windows.
--- End diff ---
Maybe add `NOTE: ...`
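
For context, this hunk appears to belong to the `spark.executor.pyspark.memory` entry (an assumption from the surrounding text; the key itself is outside the hunk). A minimal sketch of how an application would set it:

```python
# Hedged sketch: assumes the entry above documents spark.executor.pyspark.memory.
# Caps each executor's Python worker memory at 512m; on YARN/Kubernetes this
# amount is added to the executor resource request, and per the new doc line
# the limit is not applied on Windows.
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .appName("pyspark-memory-example")
    .config("spark.executor.pyspark.memory", "512m")
    .getOrCreate()
)
```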