Hi Andrew,
Thank you very much for your solution, it works like a charm, and for the very
clear explanation.
Grzegorz
(Clarification: you'll need to pass in --driver-memory not just for local
mode, but for any application you're launching with "client" deploy mode)
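As a rough illustration, a client-mode submission to a cluster could look
something like this (the master URL, class name, and jar are only placeholders):
# the driver runs inside the spark-submit JVM, so its heap must be set at launch
bin/spark-submit --master spark://host:7077 --deploy-mode client --driver-memory 2g --class com.example.MyApp my-app.jar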
2014-08-05 9:24 GMT-07:00 Andrew Or :
> Hi Grzegorz,
>
> For local mode you only have one executor, and this executor is your
> driver, so you need to set the driver's memory instead.
Hi Grzegorz,
For local mode you only have one executor, and this executor is your
driver, so you need to set the driver's memory instead. That said, in
local mode, by the time you run spark-submit, a JVM has already been
launched with the default memory settings, so setting "spark.driver.memory"
in your application code won't change that JVM's heap. Instead, you need to
pass --driver-memory to spark-submit when you launch the application.
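For example, a launch along these lines should do it (the class name and jar
are just placeholders for your own):
# give the single local JVM (driver and executor in one) a 2g heap at launch time
bin/spark-submit --driver-memory 2g --class com.example.MyApp my-app.jar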
Hi,
I wanted to make a simple Spark app running in local mode with 2g of
spark.executor.memory and 1g for caching. But the following code:
import org.apache.spark.{SparkConf, SparkContext}

val conf = new SparkConf()
.setMaster("local")
.setAppName("app")
.set("spark.executor.memory", "2g")
.set("spark.storage.memoryFraction", "0.5")
val sc = new SparkContext(conf)