The safest way would be to first shut down HDFS and then shut down Spark
(running stop-all.sh will do), and then shut down the machines.
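
For reference, a rough sketch of that sequence on a standalone cluster,
assuming the usual sbin layout under $HADOOP_HOME and $SPARK_HOME (adjust
the paths to match your install):

    # stop the HDFS daemons (NameNode, DataNodes)
    $HADOOP_HOME/sbin/stop-dfs.sh

    # stop the Spark standalone master and workers
    $SPARK_HOME/sbin/stop-all.sh

    # only then stop the EC2 instances (AWS console or CLI)

After a clean shutdown like this, the NameNode should leave safe mode on
its own shortly after the cluster comes back up and the DataNodes have
reported their blocks.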

You can execute the following command to make the NameNode leave safe mode:

*hdfs dfsadmin -safemode leave*
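
You can also check whether the NameNode is still in safe mode before
forcing it out:

*hdfs dfsadmin -safemode get*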



Thanks
Best Regards

On Sat, Jan 17, 2015 at 8:31 AM, Su She <suhsheka...@gmail.com> wrote:

> Hello Everyone,
>
> I am encountering trouble running Spark applications after I shut down and
> restart my EC2 instances. Everything else seems to work except Spark. When I
> try running a simple Spark application, like sc.parallelize(), I get the
> message that the HDFS NameNode is in safe mode.
>
> Has anyone else had this issue? Is there a proper protocol I should be
> following to turn off my Spark nodes?
>
> Thank you!
>
>
>
