rdblue commented on a change in pull request #23664: [MINOR][DOCS] Add a note that 'spark.executor.pyspark.memory' is dependent on 'resource'
URL: https://github.com/apache/spark/pull/23664#discussion_r252519261
 
 

 ##########
 File path: docs/configuration.md
 ##########
 @@ -190,8 +190,8 @@ of the most common options to set are:
     and it is up to the application to avoid exceeding the overhead memory space
     shared with other non-JVM processes. When PySpark is run in YARN or Kubernetes, this memory
     is added to executor resource requests.
-
-    NOTE: Python memory usage may not be limited on platforms that do not support resource limiting, such as Windows.
+    <br/>
+    <em>Note:</em> This feature is dependent on Python's `resource` module; therefore, the behaviors and limitations are inherited.
 
 Review comment:
   I think that wording is fine. Sorry for the delay.
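
   As background for the note being discussed, the sketch below illustrates why the setting depends on Python's `resource` module: a Python worker can only enforce the configured cap by lowering its own address-space rlimit, and the module simply is not available on platforms such as Windows. This is an illustrative sketch only; the helper name `apply_memory_limit` and the 1024 MB figure are assumptions for the example, not the actual code in `pyspark/worker.py`.

   ```python
   import sys

   try:
       import resource
       has_resource_module = True
   except ImportError:
       # Platforms such as Windows do not ship the `resource` module,
       # so no limit can be enforced from the Python side.
       has_resource_module = False


   def apply_memory_limit(limit_mb):
       """Best-effort cap on the process address space, in megabytes (hypothetical helper)."""
       if not has_resource_module or limit_mb <= 0:
           return  # nothing to do; rely on container/OS limits instead
       new_limit = limit_mb * 1024 * 1024
       soft, hard = resource.getrlimit(resource.RLIMIT_AS)
       # Only tighten the limit; never raise it above the current soft limit.
       if soft == resource.RLIM_INFINITY or new_limit < soft:
           try:
               resource.setrlimit(resource.RLIMIT_AS, (new_limit, new_limit))
           except (ValueError, OSError) as e:
               print("Could not set memory limit: %s" % e, file=sys.stderr)


   apply_memory_limit(1024)  # roughly what a 1g setting implies
   ```

   On a platform without `resource`, the call is effectively a no-op, which is what the "behaviors and limitations are inherited" wording in the doc change conveys.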
