HyukjinKwon commented on a change in pull request #23664: [MINOR][DOCS] Add a note that 'spark.executor.pyspark.memory' is dependent on 'resource'
URL: https://github.com/apache/spark/pull/23664#discussion_r251661606
 
 

 ##########
 File path: docs/configuration.md
 ##########
 @@ -190,8 +190,8 @@ of the most common options to set are:
     and it is up to the application to avoid exceeding the overhead memory space
     shared with other non-JVM processes. When PySpark is run in YARN or Kubernetes, this memory
     is added to executor resource requests.
-
-    NOTE: Python memory usage may not be limited on platforms that do not support resource limiting, such as Windows.
+    <br/>
+    <em>Note:</em> This feature is dependent on Python's `resource` module; therefore, the behaviors and limitations are inherited.
 
 Review comment:
   But this covers all the other possibilities caused by this module. I wanted to avoid having to keep fixing this doc because of the `resource` module.
   
   This way, at least people can check whether the `resource` module works in their local environment.
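
As context for the discussion above: the suggested doc wording points readers at Python's `resource` module, which is POSIX-only (its import fails on Windows). A minimal sketch of how one might check locally whether `resource` is usable and whether an address-space cap can be applied, similar in spirit to how a PySpark worker enforces `spark.executor.pyspark.memory`. The `set_memory_limit` helper and `HAS_RESOURCE` flag here are hypothetical names for illustration, not Spark APIs:

```python
# Hedged sketch: probe whether Python's `resource` module works on this
# platform. It is POSIX-only, so the import raises ImportError on Windows,
# which is why the memory limit cannot be enforced there.
try:
    import resource
    HAS_RESOURCE = True
except ImportError:
    resource = None
    HAS_RESOURCE = False


def set_memory_limit(limit_bytes):
    """Try to cap this process's address space via RLIMIT_AS.

    Returns True if the limit was applied, False if `resource` is
    unavailable or the platform rejects the limit (e.g. some macOS
    versions do not honor RLIMIT_AS).
    """
    if not HAS_RESOURCE:
        return False
    try:
        soft, hard = resource.getrlimit(resource.RLIMIT_AS)
        resource.setrlimit(resource.RLIMIT_AS, (limit_bytes, hard))
        return True
    except (ValueError, OSError):
        return False
```

Running `set_memory_limit(8 * 1024 ** 3)` on Linux typically succeeds, while on Windows it returns `False` straight away, which is exactly the platform-dependent behavior the doc note hedges about.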

----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
[email protected]


With regards,
Apache Git Services
