Sankar Mittapally created SPARK-17554:
Summary: spark.executor.memory option not working
Key: SPARK-17554
URL: https://issues.apache.org/jira/browse/SPARK-17554
Project: Spark
Issue Type: Bug
Components: Spark Core
Reporter: Sankar Mittapally

Hi,

I am new to Spark. I have a Spark cluster with 5 slaves (each with 2 cores and 4 GB RAM). In the Spark cluster dashboard I see that the memory per node is 1 GB. I tried to increase it to 2 GB by setting spark.executor.memory 2g in defaults.conf, but it didn't work. I want to increase the memory. Please let me know how to do that.

--
This message was sent by Atlassian JIRA (v6.3.4#6332)
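[Editor's note: for context, a common cause of this symptom is the configuration file name. Spark reads `conf/spark-defaults.conf` (usually created by copying `conf/spark-defaults.conf.template`), not `defaults.conf`. A minimal sketch of the setting, assuming a standalone cluster:]

```
# conf/spark-defaults.conf (not defaults.conf) on the machine
# from which the application is submitted.
# Amount of memory requested per executor:
spark.executor.memory  2g
```

The same setting can also be passed per application as `spark-submit --executor-memory 2g`. Note that on a standalone cluster the worker's advertised memory is governed separately by `SPARK_WORKER_MEMORY` in `conf/spark-env.sh`, so the dashboard's "memory per node" figure may not change from `spark.executor.memory` alone.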