dongjoon-hyun commented on code in PR #43814:
URL: https://github.com/apache/spark/pull/43814#discussion_r1394826363


##########
docs/running-on-kubernetes.md:
##########
@@ -1203,17 +1203,17 @@ See the [configuration page](configuration.html) for information on Spark config
   <td>3.0.0</td>
 </tr>
 <tr>
-  <td><code>memoryOverheadFactor</code></td>
+  <td><code>spark.kubernetes.memoryOverheadFactor</code></td>
   <td><code>0.1</code></td>
   <td>
-    This sets the Memory Overhead Factor that will allocate memory to non-JVM memory, which includes off-heap memory allocations, non-JVM tasks, various systems processes, and <code>tmpfs</code>-based local directories when <code>local.dirs.tmpfs</code> is <code>true</code>. For JVM-based jobs this value will default to 0.10 and 0.40 for non-JVM jobs.
+    This sets the Memory Overhead Factor that will allocate memory to non-JVM memory, which includes off-heap memory allocations, non-JVM tasks, various systems processes, and <code>tmpfs</code>-based local directories when <code>spark.kubernetes.local.dirs.tmpfs</code> is <code>true</code>. For JVM-based jobs this value will default to 0.10 and 0.40 for non-JVM jobs.
     This is done as non-JVM tasks need more non-JVM heap space and such tasks commonly fail with "Memory Overhead Exceeded" errors. This preempts this error with a higher default.
     This will be overridden by the value set by <code>spark.driver.memoryOverheadFactor</code> and <code>spark.executor.memoryOverheadFactor</code> explicitly.
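As a side note on the documented behavior: the overhead sizing this table describes can be illustrated with a small sketch. This is not Spark's actual implementation, just illustrative arithmetic based on the documented defaults (0.10 for JVM jobs, 0.40 for non-JVM jobs) and the documented 384 MiB minimum overhead; the function name and rounding are assumptions.

```python
def pod_memory_mib(executor_memory_mib: int,
                   overhead_factor: float = 0.1,
                   min_overhead_mib: int = 384) -> int:
    """Approximate total container memory request: heap plus overhead.

    Illustrative only: Spark multiplies the executor memory by the
    overhead factor and enforces a minimum overhead (384 MiB by default),
    unless spark.executor.memoryOverhead is set explicitly.
    """
    overhead = max(int(executor_memory_mib * overhead_factor), min_overhead_mib)
    return executor_memory_mib + overhead

# JVM job, 4 GiB executors, default 0.1 factor:
# overhead = max(409, 384) = 409 MiB, so the pod requests 4505 MiB.
print(pod_memory_mib(4096))
# Non-JVM (e.g. PySpark) job with the 0.4 default:
# overhead = 1638 MiB, so the pod requests 5734 MiB.
print(pod_memory_mib(4096, overhead_factor=0.4))
# Small executor: the 384 MiB floor dominates.
print(pod_memory_mib(1024))
```

This also shows why non-JVM jobs get the larger 0.40 default: their working memory lives outside the JVM heap, so a 0.10 overhead is commonly exhausted.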

Review Comment:
   It seems that you are looking at the first commit. I removed the K8s part from this PR completely in the latest commit.



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the URL above to go to the specific comment.

To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org

