If you are using a Spark Standalone deployment, make sure you set SPARK_WORKER_MEMORY to more than 20G, and that the machine actually has 20G of physical memory available.

Yong
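As a minimal sketch of that setting (assuming a standalone cluster and an illustrative 24g worker size — adjust to your instances), the worker memory is set in conf/spark-env.sh on each worker node:

```shell
# conf/spark-env.sh on each worker (illustrative values, not from the thread)
# Total memory a worker may hand out to executors on this node.
# It must be at least as large as the --executor-memory you request,
# and should leave headroom below the machine's physical RAM for the OS.
export SPARK_WORKER_MEMORY=24g
```

If --executor-memory exceeds SPARK_WORKER_MEMORY, the standalone master cannot place the executor, which matches the symptom of the job sitting with no cores assigned.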
> Date: Tue, 7 Apr 2015 20:58:42 -0700
> From: li...@adobe.com
> To: user@spark.apache.org
> Subject: EC2 spark-submit --executor-memory
>
> Dear Spark team,
>
> I'm using the EC2 script to start up a Spark cluster. If I log in and use
> the executor-memory parameter in the submit script, the UI tells me that
> no cores are assigned to the job and nothing happens. Without
> executor-memory everything works fine... until I get
> "dag-scheduler-event-loop" java.lang.OutOfMemoryError: Java heap space,
> but that's another issue.
>
> ./bin/spark-submit \
>   --class ... \
>   --executor-memory 20G \
>   /path/to/examples.jar
>
> Thanks.
>
> --
> View this message in context:
> http://apache-spark-user-list.1001560.n3.nabble.com/EC2-spark-submit-executor-memory-tp22417.html
> Sent from the Apache Spark User List mailing list archive at Nabble.com.