Hi André,

thank you very much for your explanation. I have 4 cores, so 128MB for
the executor was too little. I thought that when spark.executor.memory
is set to 512m each worker gets 512MB, but when I run the application
through spark-submit I have to provide the --driver-memory argument to
spark-submit (setting spark.executor.memory at runtime doesn't work in
this case), as Andrew explains in this thread:
http://apache-spark-user-list.1001560.n3.nabble.com/Setting-spark-executor-memory-problem-td11429.html
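
For the record, here is a minimal sketch of what I was doing (the class
and jar names below are made up), running with --master local[4]:

import org.apache.spark.{SparkConf, SparkContext}

// MemoryTest is just an example name for this sketch.
object MemoryTest {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf()
      .setAppName("memory-test")
      // Has no effect in local mode: the executor runs inside the
      // driver JVM, whose heap size was fixed before this line ran.
      .set("spark.executor.memory", "512m")
    val sc = new SparkContext(conf)
    // The conf reports 512m, but the usable heap is still whatever
    // the driver JVM was started with.
    println(sc.getConf.get("spark.executor.memory"))
    println(Runtime.getRuntime.maxMemory / (1024 * 1024) + " MB max heap")
    sc.stop()
  }
}

Launching it with something like
spark-submit --master local[4] --driver-memory 512m --class MemoryTest memory-test.jar
is what actually resizes the heap in local mode.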

Best regards,
Grzegorz