Github user holdenk commented on a diff in the pull request:

    https://github.com/apache/spark/pull/21977#discussion_r212757824

    --- Diff: docs/configuration.md ---
    @@ -179,6 +179,15 @@ of the most common options to set are:
           (e.g. <code>2g</code>, <code>8g</code>).
         </td>
       </tr>
    +<tr>
    +  <td><code>spark.executor.pyspark.memory</code></td>
    +  <td>Not set</td>
    +  <td>
    +    The amount of memory to be allocated to PySpark in each executor, in MiB
    --- End diff --

    We should probably mention that this is added to the executor memory request in YARN mode.
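For context, a minimal sketch of how this setting would be supplied at submit time (the app name and resource sizes here are illustrative, not from the diff). Note that Spark size properties accept unit suffixes such as `m` and `g`:

```shell
# Hypothetical example: reserve 8g for the JVM executor and 2g for the
# Python worker processes. Per the review comment, on YARN the PySpark
# memory is added on top of spark.executor.memory when sizing the
# container request.
spark-submit \
  --master yarn \
  --conf spark.executor.memory=8g \
  --conf spark.executor.pyspark.memory=2g \
  my_pyspark_app.py
```

This is a config fragment only; whether the limit is enforced on the Python side depends on the platform's support for resource limits.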