Github user rdblue commented on a diff in the pull request:
https://github.com/apache/spark/pull/21977#discussion_r213121178
--- Diff: docs/configuration.md ---
@@ -179,6 +179,15 @@ of the most common options to set are:
(e.g. <code>2g</code>, <code>8g</code>).
</td>
</tr>
+<tr>
+ <td><code>spark.executor.pyspark.memory</code></td>
+ <td>Not set</td>
+ <td>
+ The amount of memory to be allocated to PySpark in each executor, in MiB
--- End diff ---
I've added "When PySpark is run in YARN, this memory is added to executor
resource requests."
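For reference, a minimal sketch of how this option can be set at submit time; the `2g` value and the `my_app.py` script name are illustrative, not from the diff:

```shell
# Reserve 2 GiB per executor for the PySpark worker processes
# (illustrative value; the option accepts byte-size strings, default unit MiB).
# On YARN, this amount is added to the executor's resource request.
spark-submit \
  --conf spark.executor.pyspark.memory=2g \
  my_app.py
```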
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]