Github user andrewor14 commented on a diff in the pull request:
https://github.com/apache/spark/pull/4018#discussion_r22882991
--- Diff: docs/running-on-yarn.md ---
@@ -30,6 +30,22 @@ Most of the configs are the same for Spark on YARN as for other deployment modes
</td>
</tr>
<tr>
+ <td><code>spark.driver.cores</code></td>
+ <td>1</td>
+ <td>
+ Number of cores to use for the YARN Application Master in cluster mode.
--- End diff ---
It's odd that the description for `spark.driver.cores` only talks about
the YARN Application Master. Maybe this should read more like the following:
Number of cores used by the driver in YARN cluster mode. Since the driver
runs in the same JVM as the YARN Application Master in cluster mode, this
also controls the cores used by the YARN AM. In client mode, use ... instead.
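
For illustration, a minimal sketch of how this setting would typically be
passed in cluster mode (the `--master yarn-cluster` syntax matches this
Spark era; the application class and jar names are hypothetical):

    # Hypothetical sketch: set driver cores via spark-submit in cluster mode.
    # The driver runs inside the YARN Application Master in this mode, so the
    # value also sizes the cores requested for the AM container.
    $ ./bin/spark-submit \
        --master yarn-cluster \
        --conf spark.driver.cores=2 \
        --class com.example.MyApp \
        my-app.jar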