Hi all, I am new to Mesos.  I set up a cluster this weekend with three
agents and one master, running Mesos 1.0.0.  The resources show up in the
Mesos UI and all the agents are listed in the Agents tab, so everything
looks good from that vantage point.

Next I installed Spark 2.0.0 on the master and on each agent, at the same
path (/opt/spark) on every machine.  I launch spark-shell from the master
like this:

./spark-shell --master mesos://zk://moe:2181/mesos -c spark.mesos.executor.home=`pwd`
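
For reference, once the shell is up, one sanity check I know of is asking
it which master it actually attached to:

// should echo the Mesos master URL the shell registered against,
// i.e. mesos://zk://moe:2181/mesos in my case
sc.master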

The shell comes up nicely; however, none of the resources get assigned to
the Spark framework (zeros for everything).
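
To rule out a typo in the executor home setting, I can also read the config
back from inside the shell; the `pwd` should have expanded to whatever
directory I launched from (/opt/spark here):

// read back the property passed on the command line
sc.getConf.get("spark.mesos.executor.home")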

If I try a simple task like

sc.parallelize(0 to 10, 8).count

it fails:

WARN TaskSchedulerImpl: Initial job has not accepted any resources; check
your cluster UI to ensure that workers are registered and have sufficient
resources
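
As I understand it, that warning means no executors ever registered with
the driver.  A rough way to confirm that from the shell (I believe the
driver itself counts as one entry in this map) is:

// number of registered block managers; a healthy cluster should show
// more than 1 (the driver plus at least one Mesos executor)
sc.getExecutorMemoryStatus.size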


I'll post my logs in a bit if need be.  Hopefully it's a common newbie
error with a simple fix.

Thank you
