Github user andrewor14 commented on a diff in the pull request:
https://github.com/apache/spark/pull/4665#discussion_r25036932
--- Diff: docs/configuration.md ---
@@ -135,6 +139,10 @@ of the most common options to set are:
      Having a high limit may cause out-of-memory errors in driver (depends on spark.driver.memory
      and memory overhead of objects in JVM). Setting a proper limit can protect the driver from
      out-of-memory errors.
+
+     <br /><em>Note:</em> In client mode, this config must not be set through the <code>SparkConf</code>
+     directly in your application, because the driver JVM has already started at that point.
+     Instead, please set this through the default properties file.</td>
--- End diff ---
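(Context for the note under discussion: for configs that genuinely must be known before the driver JVM launches in client mode, such as `spark.driver.memory`, the properties-file route the note recommends does hold. A minimal sketch, assuming the standard `conf/spark-defaults.conf` location read by `spark-submit`:)

```
# conf/spark-defaults.conf (sketch): spark-submit reads this file before the
# driver JVM starts, so startup-time settings belong here in client mode.
spark.driver.memory    4g
```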
This doesn't actually apply here. `maxResultSize` can be read after the JVM has started, so please remove this note.
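To illustrate the point, here is a minimal sketch of setting `spark.driver.maxResultSize` on the `SparkConf` inside the application, which works even in client mode precisely because the value is only consulted after the driver JVM is up (the app name and local master are assumptions for the sketch):

```scala
import org.apache.spark.{SparkConf, SparkContext}

object MaxResultSizeExample {
  def main(args: Array[String]): Unit = {
    // spark.driver.maxResultSize caps the total size of serialized results
    // returned to the driver (e.g. by collect()). Unlike spark.driver.memory,
    // it is read after the driver JVM has started, so setting it on SparkConf
    // in the application works even in client mode.
    val conf = new SparkConf()
      .setAppName("max-result-size-demo")  // hypothetical app name
      .setMaster("local[*]")               // assumption: local run for the sketch
      .set("spark.driver.maxResultSize", "2g")

    val sc = new SparkContext(conf)
    try {
      // A collect()/sum() whose serialized result exceeded the 2g cap above
      // would abort the job with an error rather than OOM-ing the driver.
      val total = sc.parallelize(1 to 1000).sum()
      println(s"sum = $total")
    } finally {
      sc.stop()
    }
  }
}
```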