Hi,

I am new to the world of Hadoop and this is my first post here.
Recently I set up a multi-node Hadoop cluster (3 nodes) with the HA
feature for NameNode & ResourceManager, backed by a ZooKeeper quorum.

*Daemons running in NN1 (ptfhadoop01v) :*

2945 JournalNode
3137 DFSZKFailoverController
6385 Jps
3338 NodeManager
22730 QuorumPeerMain
2747 DataNode
3228 ResourceManager
2636 NameNode

*Daemons running for NN2 (ntpcam01v) :*

19620 Jps
3894 QuorumPeerMain
16966 ResourceManager
16808 NodeManager
16475 DataNode
16572 JournalNode
17101 NameNode
16702 DFSZKFailoverController

*Daemons running for DN1 (ntpcam03v) :*

12228 QuorumPeerMain
29060 NodeManager
28858 DataNode
29644 Jps
28956 JournalNode

*ptfhadoop01v* - Active Namenode & ResourceManager
*ntpcam01v* - Standby Namenode & ResourceManager
*ntpcam03v* - Datanode

Now, I have installed Apache Spark *version 1.6.0* on *NN1*
(ptfhadoop01v).
I copied the Spark assembly jar into HDFS and set *SPARK_JAR* in my
~/.bashrc file.
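For clarity, the jar-upload step described above looked roughly like this (the exact jar name and HDFS path below are illustrative assumptions for a default Spark 1.6.0 build, not my literal paths):

```shell
# Upload the Spark assembly jar to HDFS so YARN containers can fetch it
# without shipping it on every submission.
# Jar name and target directory are assumptions; adjust to your build.
hdfs dfs -mkdir -p /user/spark/share/lib
hdfs dfs -put "$SPARK_HOME/lib/spark-assembly-1.6.0-hadoop2.6.0.jar" \
    /user/spark/share/lib/

# In ~/.bashrc, point SPARK_JAR at the uploaded jar:
export SPARK_JAR=hdfs:///user/spark/share/lib/spark-assembly-1.6.0-hadoop2.6.0.jar
```

These commands need a running HDFS, so I can only sketch them here.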

spark-env.sh : (*I have set only these parameters in the spark-env.sh*)

export HADOOP_CONF_DIR=${HADOOP_CONF_DIR:-"/etc/hadoop"}
export SPARK_YARN_QUEUE=dev
export SPARK_MASTER_IP=ptfhadoop01v
export SPARK_WORKER_CORES=2
export SPARK_WORKER_MEMORY=500mb
export SPARK_WORKER_INSTANCES=2

*I have not set any spark-defaults.conf file*
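As I understand it, the SPARK_MASTER_IP and SPARK_WORKER_* variables above are only read by the standalone master/workers, not by YARN. If it helps, a minimal spark-defaults.conf for YARN mode might look like the following (every value here is an illustrative assumption, not a setting from my cluster):

```
spark.master            yarn-client
spark.yarn.jar          hdfs:///user/spark/share/lib/spark-assembly-1.6.0-hadoop2.6.0.jar
spark.yarn.queue        dev
spark.executor.memory   512m
spark.executor.cores    1
```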

I am able to start spark-shell in local mode by issuing the following
command (from NN1):
$ *spark-shell*

But when I try to launch it in yarn-client mode it always fails. The
command I used is:
$ *spark-shell --master yarn-client*
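I also tried passing the resource requests explicitly on the command line, roughly as below (the memory/core values are arbitrary small examples, not tuned settings):

```shell
# Launch spark-shell on YARN in client mode with explicit,
# deliberately small resource requests (example values only).
spark-shell --master yarn-client \
  --driver-memory 512m \
  --executor-memory 512m \
  --executor-cores 1 \
  --num-executors 2
```

This requires a live YARN cluster, so I can only show the invocation.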

Spark-Error.txt
<http://apache-spark-user-list.1001560.n3.nabble.com/file/n26691/Spark-Error.txt>
  

Can anyone tell me what I am doing wrong? Do I need to install Spark on
every node in the cluster?
How do I start spark-shell in yarn-client mode?

Thanks in advance.

--
View this message in context: 
http://apache-spark-user-list.1001560.n3.nabble.com/Running-Spark-on-Yarn-Client-Cluster-mode-tp26691.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.

---------------------------------------------------------------------
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org
