Github user ivoson commented on a diff in the pull request:
https://github.com/apache/spark/pull/22252#discussion_r213544524
--- Diff: docs/configuration.md ---
@@ -152,7 +152,7 @@ of the most common options to set are:
<td><code>spark.driver.memory</code></td>
<td>1g</td>
<td>
- Amount of memory to use for the driver process, i.e. where SparkContext is initialized, in MiB
+ Amount of memory to use for the driver process, i.e. where SparkContext is initialized, in bytes
--- End diff ---
@xuanyuanking @HyukjinKwon @srowen thanks for your replies. I've also noticed
the code above, but 'DRIVER_MEMORY' and 'EXECUTOR_MEMORY' in
config/package.scala are never used; I think they may be intended for future
use. The code below shows how the conf is actually used for now, please take
a look.
https://github.com/apache/spark/blob/master/core/src/main/scala/org/apache/spark/SparkContext.scala#L465
https://github.com/apache/spark/blob/master/core/src/main/scala/org/apache/spark/util/Utils.scala#L1130
https://github.com/ivoson/spark/blob/master/launcher/src/main/java/org/apache/spark/launcher/SparkSubmitCommandBuilder.java#L265
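
To illustrate why a unit-less value ends up being read as bytes on this code
path, here is a minimal, self-contained sketch. The object name
`MemoryStringSketch` and the simplified parsing are made up for illustration;
the real logic lives in `JavaUtils.byteStringAsBytes` and
`Utils.memoryStringToMb` at the lines linked above.

```scala
object MemoryStringSketch {
  private val suffixes = Map(
    "k" -> 1024L, "kb" -> 1024L,
    "m" -> 1024L * 1024, "mb" -> 1024L * 1024,
    "g" -> 1024L * 1024 * 1024, "gb" -> 1024L * 1024 * 1024)

  // Parse "1g", "512m" or a bare "1024"; a bare number is treated as BYTES.
  def byteStringAsBytes(str: String): Long = {
    val (num, suffix) = str.trim.toLowerCase.span(_.isDigit)
    if (suffix.isEmpty) num.toLong else num.toLong * suffixes(suffix)
  }

  // Same shape as Utils.memoryStringToMb: parse to bytes, then floor to MiB.
  def memoryStringToMb(str: String): Int =
    (byteStringAsBytes(str) / 1024 / 1024).toInt

  def main(args: Array[String]): Unit = {
    println(memoryStringToMb("2g"))   // 2048
    println(memoryStringToMb("1024")) // 0 -- read as 1024 bytes, not MiB
  }
}
```

So with no suffix, a setting like `spark.driver.memory=1024` would mean 1024
bytes here, which is why the doc change above says "in bytes".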
---