In the example spark-env.sh file, the comments indicate that spark.driver.memory is the memory for the master in YARN mode. None of that actually makes sense to me…
In any case, I'm using Spark in standalone mode, running the driver on a separate machine from the master. I have a few questions about that:

- Does spark.driver.memory only apply in YARN mode?
- Does the value apply to the master or to the driver?
- If the memory applies to the driver, what is it used for?
- Does it make sense to change it based on what kind of machine the driver is running on? (We have both 256GB nodes and 128GB nodes available for use as the driver.)

Thanks,
Ken
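P.S. In case it clarifies what I'm asking, here is a minimal sketch of my current understanding; the app class, master URL (spark://master-host:7077), and memory size are all hypothetical. My assumption is that spark.driver.memory has to be supplied before the driver JVM starts (e.g. via spark-submit --driver-memory or spark-defaults.conf), and that setting it from application code is too late in client mode:

    import org.apache.spark.{SparkConf, SparkContext}

    // Hypothetical application; names and sizes are made up for illustration.
    object MyApp {
      def main(args: Array[String]): Unit = {
        val conf = new SparkConf()
          .setAppName("my-app")
          .setMaster("spark://master-host:7077") // standalone master URL
          // My assumption: in client mode this line is too late to take
          // effect, because the driver JVM is already running by the time
          // this application code executes.
          .set("spark.driver.memory", "100g")

        val sc = new SparkContext(conf)
        // ... job logic would go here ...
        sc.stop()
      }
    }

whereas launching with something like

    spark-submit --driver-memory 100g --master spark://master-host:7077 \
      --class MyApp my-app.jar

would size the driver heap up front. Is that understanding correct?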