Github user vanzin commented on a diff in the pull request:
https://github.com/apache/spark/pull/19717#discussion_r155016979
--- Diff: docs/running-on-yarn.md ---
@@ -234,18 +234,11 @@ To use a custom metrics.properties for the application master and executors, upd
The amount of off-heap memory (in megabytes) to be allocated per
executor. This is memory that accounts for things like VM overheads, interned
strings, other native overheads, etc. This tends to grow with the executor size
(typically 6-10%).
</td>
</tr>
-<tr>
- <td><code>spark.yarn.driver.memoryOverhead</code></td>
--- End diff ---
Look for `configsWithAlternatives` in `SparkConf.scala`.
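
For context: `configsWithAlternatives` is the map in `SparkConf.scala` that ties a current config key to its deprecated spellings, so reads of the new key fall back to the old one and a deprecation warning can be issued. Below is a minimal, self-contained sketch of that pattern; the entries and version strings are illustrative for the YARN overhead keys, not the exact Spark code.

```scala
// Sketch of the alternate-config pattern (the real map lives in SparkConf.scala);
// the entries and version strings below are illustrative only.
case class AlternateConfig(key: String, version: String)

val configsWithAlternatives: Map[String, Seq[AlternateConfig]] = Map(
  "spark.driver.memoryOverhead" -> Seq(
    AlternateConfig("spark.yarn.driver.memoryOverhead", "2.3")),
  "spark.executor.memoryOverhead" -> Seq(
    AlternateConfig("spark.yarn.executor.memoryOverhead", "2.3")))

// Resolve a key: prefer the new name, then fall back to any deprecated alternative.
def resolve(settings: Map[String, String], key: String): Option[String] =
  settings.get(key).orElse {
    configsWithAlternatives.getOrElse(key, Nil)
      .flatMap(alt => settings.get(alt.key))
      .headOption
  }

// An application that still sets only the old YARN key keeps working:
val settings = Map("spark.yarn.executor.memoryOverhead" -> "1024")
assert(resolve(settings, "spark.executor.memoryOverhead").contains("1024"))
```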
---
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]