Hey Jon,

Since you're running on YARN, the Spark standalone Worker shouldn't be
involved.  Are you able to go to the YARN ResourceManager web UI and click
on "Nodes" in the top left?  Does that node show up in the list?  If you
click on it, what's shown under "Total Pmem allocated for Container"?
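
If poking at the web UI is awkward, the same node list is available
programmatically through YARN's YarnClient API.  A minimal sketch, assuming
the cluster's Hadoop client jars and yarn-site.xml are on the classpath of
whatever machine you run it from (the object name and output formatting
here are just illustrative):

  import org.apache.hadoop.yarn.client.api.YarnClient
  import org.apache.hadoop.yarn.conf.YarnConfiguration

  import scala.collection.JavaConverters._

  object ListYarnNodes {
    def main(args: Array[String]): Unit = {
      // Reads yarn-site.xml from the classpath, so run this on a node
      // that has the cluster's Hadoop configuration.
      val client = YarnClient.createYarnClient()
      client.init(new YarnConfiguration())
      client.start()

      // One NodeReport per NodeManager registered with the RM.
      for (node <- client.getNodeReports().asScala) {
        println(s"${node.getNodeId} state=${node.getNodeState} " +
          s"containers=${node.getNumContainers} " +
          s"used=${node.getUsed} / capability=${node.getCapability}")
      }
      client.stop()
    }
  }

A node that's missing from that list, or sitting in an UNHEALTHY or LOST
state, would explain why no containers are being placed on it.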

It might also be worthwhile to try restarting the NodeManager on that node.

-Sandy

On Fri, Oct 3, 2014 at 2:59 PM, jonathan.keebler <jkeeble...@gmail.com>
wrote:

> Hi all,
>
> We're running Spark 1.0 on CDH 5.1.2.  We're using Spark in YARN-client
> mode.
>
> We're seeing that one of our nodes is not being assigned any tasks, and no
> resources (RAM, CPU) are being used on this node.  In the CM UI this worker
> node is in good health and the Spark Worker process is running, along with
> the yarn-NODEMANAGER and hdfs-DATANODE.
>
> We've tried restarting the Spark Worker process while the application is
> running, but there are still no tasks assigned to the worker.
>
> Any hints or thoughts on this?  We can wait until the current job finishes
> and restart Spark, YARN, etc., but I wonder if there is a way to make the
> currently running job recognize the worker and begin assigning tasks.
>
> Thanks!
> - Jon
>
