Github user nartz commented on the pull request:
https://github.com/apache/spark/pull/2410#issuecomment-58398280
I can, but it seems the scope of this pull request may be too small, as
it needs to encapsulate and explain a bunch of new parameters popping up, and
nuances about when
Github user nartz commented on the pull request:
https://github.com/apache/spark/pull/2410#issuecomment-55909584
using `./bin/spark-submit --driver-memory 3g myscript.py` on the command
line works for me
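For context, the setting discussed in this thread can be supplied in more than one way; a minimal sketch (the `3g` value and script name are just the illustrative ones from the comment above, not a recommendation):

```shell
# Pass driver memory on the command line, as in the comment above:
./bin/spark-submit --driver-memory 3g myscript.py

# Equivalently, set it in conf/spark-defaults.conf, which
# spark-submit reads at launch:
#   spark.driver.memory  3g
```

Note that setting `spark.driver.memory` programmatically via `SparkConf` inside the application generally has no effect in client mode, because the driver JVM has already started by the time that code runs; that is why the command-line flag or `spark-defaults.conf` is the reliable route.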
GitHub user nartz opened a pull request:
https://github.com/apache/spark/pull/2410
add spark.driver.memory to config docs
It took me a minute to track this down, so I thought it could be useful to
have it in the docs.
I'm unsure if 512mb is the default