I start the Spark master with $SPARK_HOME/sbin/start-master.sh, but I use the
following to start the workers:

$SPARK_HOME/bin/spark-class org.apache.spark.deploy.worker.Worker spark://$MASTER:7077
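For anyone following along, here is a minimal sketch of the two steps together. It assumes SPARK_HOME points at a Spark install and that MASTER holds the master node's hostname (both are assumptions on my part); 7077 is the standalone master's default port.

```shell
# Assumption: SPARK_HOME is set and MASTER names the master host.
# Fall back to this machine's hostname if MASTER is unset.
MASTER=${MASTER:-$(hostname)}
SPARK_MASTER_URL="spark://${MASTER}:7077"

# On the master node:
#   "$SPARK_HOME/sbin/start-master.sh"

# On each worker node, point the worker at the master URL:
#   "$SPARK_HOME/bin/spark-class" org.apache.spark.deploy.worker.Worker "$SPARK_MASTER_URL"

echo "$SPARK_MASTER_URL"
```

Running the worker via spark-class like this keeps it in the foreground, which is handy when the worker is itself a job submitted to the scheduler.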

See my blog for more details, although I need to update the posts based on what
I've changed today:
https://unscrupulousmodifier.wordpress.com/2015/07/20/running-spark-as-a-job-on-a-grid-engine-hpc-cluster-part-1

—Ken
> On Dec 21, 2015, at 4:00 PM, MegaLearn <j...@megalearningllc.com> wrote:
> 
> How do you start the Spark daemon, directly?
> https://issues.apache.org/jira/browse/SPARK-11570
> 
> If that's the case, the solution is to start it by script, but I didn't read the
> whole thing. In my little world (currently a 2-machine cluster, soon moving to
> 300) I have the same issue with 1.4.1, and I thought it was how our
> /etc/hosts was set up, until I just read that bug. Though I use start-master
> and start-slave to start it, so that's probably not my problem.
> 
> So if the users use the IP address does it work?
> 
> 
> 
> --
> View this message in context: 
> http://apache-spark-user-list.1001560.n3.nabble.com/Applicaiton-Detail-UI-change-tp25756p25759.html
> Sent from the Apache Spark User List mailing list archive at Nabble.com.
> 
