I have modified my yarn-site.xml to include the following properties:

<property>
    <name>yarn.nodemanager.resource.memory-mb</name>
    <value>4096</value>
</property>

<property>
    <name>yarn.scheduler.minimum-allocation-mb</name>
    <value>256</value>
</property>

<property>
    <name>yarn.scheduler.maximum-allocation-mb</name>
    <value>2250</value>
</property>
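
If I read these correctly, each NodeManager offers 4096 MB in total to containers, any single container request must land between 256 MB and 2250 MB, and YARN rounds requests up to the next multiple of the 256 MB minimum allocation (so, for example, a 900 MB request would become a 1024 MB container). Please correct me if that interpretation is wrong.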

I then issued the following command to run spark-shell in YARN client mode:

spark-shell --master yarn-client --executor-memory 512m --driver-memory 1g --num-executors 2
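
One thing I am unsure about: since this is client mode, the driver JVM runs locally and YARN only hosts a small Application Master container, which (per the running-on-YARN docs) is sized by spark.yarn.am.memory rather than --driver-memory. In case that AM container is what fails to fit, here is a variant with that knob set explicitly:

spark-shell --master yarn-client --executor-memory 512m \
    --driver-memory 1g --num-executors 2 \
    --conf spark.yarn.am.memory=512m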

But I am still unable to start the Spark context; it fails with the same
error. Can someone explain how to set the number of cores, executor memory,
and driver memory based on one's cluster configuration? I specified my
machine configuration (RAM and disk space) in my previous post.
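
In case my mental model is part of the problem, here is how I currently work out the container sizes, as a rough Scala sketch. The max(384 MB, 10% of heap) overhead and the round-up-to-minimum-allocation behaviour are my assumptions from the Spark 1.x and YARN docs, and in client mode the driver actually runs locally while the smaller AM container goes to YARN, so the "driver container" line really only applies to cluster mode:

object YarnContainerMath {
  // Values from my yarn-site.xml above.
  val minAllocMb = 256   // yarn.scheduler.minimum-allocation-mb
  val maxAllocMb = 2250  // yarn.scheduler.maximum-allocation-mb
  val nodeMemMb  = 4096  // yarn.nodemanager.resource.memory-mb

  // Spark's default off-heap overhead, as I understand the
  // spark.yarn.*.memoryOverhead defaults: max(384 MB, 10% of heap).
  def overheadMb(heapMb: Int): Int = math.max(384, heapMb / 10)

  // YARN rounds each request up to a multiple of the minimum allocation.
  def containerMb(heapMb: Int): Int = {
    val requested = heapMb + overheadMb(heapMb)
    math.ceil(requested.toDouble / minAllocMb).toInt * minAllocMb
  }

  def main(args: Array[String]): Unit = {
    val executor = containerMb(512)   // --executor-memory 512m -> 1024 MB
    val driver   = containerMb(1024)  // --driver-memory 1g -> 1536 MB
    val total    = 2 * executor + driver
    println(s"executor container: $executor MB (fits max? ${executor <= maxAllocMb})")
    println(s"driver container:   $driver MB (fits max? ${driver <= maxAllocMb})")
    println(s"2 executors + driver: $total MB (fits node? ${total <= nodeMemMb})")
  }
}

By that arithmetic everything should fit (3584 MB against 4096 MB on the node, and each container under 2250 MB), which is why I am confused that it still fails.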

I hope someone can get me over this hurdle. Thanks again.



