The Master doesn't do computation work; the Workers do. I don't quite
understand the rest; there isn't a "Spark slave" role as such, only the
Master and Workers. You can have the master and workers, and even your
driver, on one machine; what's the error you're getting?

On Sun, Aug 31, 2014 at 12:53 AM, arthur.hk.c...@gmail.com
<arthur.hk.c...@gmail.com> wrote:
> Hi,
>
> I have few questions about Spark Master and Slave setup:
>
> Here, I have 5 Hadoop nodes (n1, n2, n3, n4, and n5 respectively), at the
> moment I run Spark under these nodes:
> n1:    Hadoop Active Name Node, Hadoop Slave, Spark Active Master
> n2:    Hadoop Standby Name Node, Hadoop Slave, Spark Slave
> n3:    Hadoop Slave, Spark Slave
> n4:    Hadoop Slave, Spark Slave
> n5:    Hadoop Slave, Spark Slave
>
> Questions:
> Q1: If I set n1 as both Spark Master and Spark Slave, I cannot start the
> Spark cluster. Does it mean that, unlike Hadoop, I cannot use the same
> machine as both MASTER and SLAVE in Spark?
> n1:    Hadoop Active Name Node, Hadoop Slave, Spark Active Master, Spark
> Slave    (failed to start Spark)
> n2:    Hadoop Standby Name Node, Hadoop Slave, Spark Slave
> n3:    Hadoop Slave, Spark Slave
> n4:    Hadoop Slave, Spark Slave
> n5:    Hadoop Slave, Spark Slave
>
> Q2: I am planning Spark HA; what if I use n2 as "Spark Standby Master and
> Spark Slave"? Is Spark allowed to run the Standby Master and a Slave on the
> same machine?
> n1:    Hadoop Active Name Node, Hadoop Slave, Spark Active Master
> n2:    Hadoop Standby Name Node, Hadoop Slave, Spark Standby Master, Spark
> Slave
> n3:    Hadoop Slave, Spark Slave
> n4:    Hadoop Slave, Spark Slave
> n5:    Hadoop Slave, Spark Slave
>
> Q3: Does the Spark Master node do actual computation work like a worker, or
> is it just a pure monitoring node?
>
> Regards
> Arthur

---------------------------------------------------------------------
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org
