Sankar Mittapally created SPARK-17554:

             Summary: spark.executor.memory option not working
                 Key: SPARK-17554
             Project: Spark
          Issue Type: Bug
          Components: Spark Core
            Reporter: Sankar Mittapally


 I am new to Spark. I have a Spark cluster with 5 slaves (each with 2 cores 
and 4 GB RAM). In the Spark cluster dashboard the memory per node shows as 1 GB. 
I tried to increase it to 2 GB by setting spark.executor.memory 2g in 
defaults.conf, but it didn't take effect. How can I increase the executor 
memory?
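For reference, a sketch of the usual ways to set executor memory in a standalone cluster (assuming the default Spark directory layout; the setting only applies to applications started after the change):

```
# conf/spark-defaults.conf (note: spark-defaults.conf, not defaults.conf)
spark.executor.memory   2g
```

Alternatively, it can be passed per application at submit time with `spark-submit --executor-memory 2g`. Note that executors cannot be given more memory than the worker advertises; on a standalone cluster the per-worker total is controlled by `SPARK_WORKER_MEMORY` in conf/spark-env.sh.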

This message was sent by Atlassian JIRA
