GitHub user ilganeli opened a pull request:

    https://github.com/apache/spark/pull/4665

    SPARK-5570: No docs stating that `new SparkConf().set("spark.driver.memory", ...)` will not work

    I've updated the documentation to reflect the true behavior of this setting in client vs. cluster mode.
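
    For context, a minimal Scala sketch of the behavior being documented (the app name and memory value below are illustrative, not taken from the patch):

        import org.apache.spark.{SparkConf, SparkContext}

        // In client mode the driver JVM is already running by the time this
        // code executes, so the setting below cannot change the driver's heap.
        val conf = new SparkConf()
          .setAppName("example")
          .set("spark.driver.memory", "4g") // ignored for the driver in client mode
        val sc = new SparkContext(conf)

    In that case the memory has to be set before the driver starts, e.g. via
    `./bin/spark-submit --driver-memory 4g ...` or in spark-defaults.conf.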

You can merge this pull request into a Git repository by running:

    $ git pull https://github.com/ilganeli/spark SPARK-5570

Alternatively you can review and apply these changes as the patch at:

    https://github.com/apache/spark/pull/4665.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

    This closes #4665
    
----
commit 17b751d0cacc6b2f8af76ff234602296aec013c9
Author: Ilya Ganelin <[email protected]>
Date:   2015-02-17T23:48:45Z

    Updated documentation for driver-memory to reflect its true behavior in 
client vs cluster mode

commit c8995644651d1a0b0a3ecf4b0ae3fb0d78bfe5b4
Author: Ilya Ganelin <[email protected]>
Date:   2015-02-17T23:51:45Z

    Merge remote-tracking branch 'upstream/master' into SPARK-5570

----

