Hi, all.
I'm trying to connect to a remote cluster from my machine using Spark
0.7.3. In conf/spark-env.sh, I've set MASTER, SCALA_HOME, SPARK_MASTER_IP,
and SPARK_MASTER_PORT.
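For reference, my conf/spark-env.sh looks roughly like this (the hostname and Scala path below are placeholders, not my actual values):

```shell
# conf/spark-env.sh -- hostname and path are placeholders
export SCALA_HOME=/usr/local/scala-2.9.3      # Scala install (0.7.x builds against 2.9.3)
export SPARK_MASTER_IP=master-host            # address the master binds to
export SPARK_MASTER_PORT=7077                 # port the master listens on
export MASTER=spark://master-host:7077        # URL my jobs connect to
```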
When I try to run a job, it starts but never makes any progress, and I
keep getting the following warning:
13/12/06 13:37:20 WARN cluster.ClusterScheduler: Initial job has not
accepted any resources; check your cluster UI to ensure that workers are
registered
But when I look at the cluster UI in a browser, it shows 8 workers
registered, all alive.
What does this warning mean? I assume I'm missing something in the
setup; does anyone know what?
Thanks in advance,
-Nathan Kronenfeld