Re: HDFS Namenode in safemode when I turn off my EC2 instance

2015-01-27 Thread Su She
Thanks Akhil! 1) I had to do sudo -u hdfs hdfs dfsadmin -safemode leave. a) I had created a user called hdfs with superuser privileges in Hue, hence the double hdfs. 2) Lastly, I know this is getting a bit off topic, but this is my /etc/hosts file: 127.0.0.1 localhost.localdomain
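Su She's fix (running dfsadmin as the HDFS superuser) can be sketched as the wrapper below. The user literally named "hdfs" comes from his Hue setup; the helper name is hypothetical.

```shell
# Hypothetical helper: force the namenode out of safe mode as the HDFS
# superuser (a user named "hdfs" here, matching Su She's setup), then
# print the current mode to confirm the change took effect.
leave_safemode() {
  sudo -u hdfs hdfs dfsadmin -safemode leave &&
  sudo -u hdfs hdfs dfsadmin -safemode get
}
```

The final `get` should print "Safe mode is OFF" on success.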

Re: HDFS Namenode in safemode when I turn off my EC2 instance

2015-01-26 Thread Su She
Hello Sean and Akhil, I shut down the services on Cloudera Manager. I shut them down in the appropriate order and then stopped all services of CM. I then shut down my instances. I then turned my instances back on, but I am getting the same error. 1) I tried hadoop fs -safemode leave and it said

Re: HDFS Namenode in safemode when I turn off my EC2 instance

2015-01-26 Thread Akhil Das
The command would be: hadoop dfsadmin -safemode leave. If you are not able to ping your instances, it may be because you are blocking all ICMP requests. I'm not quite sure why you are not able to ping google.com from your instances. Make sure the internal IP (ifconfig) is proper in the
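The `hadoop dfsadmin` form Akhil gives works on older releases but is deprecated; on current Hadoop the equivalent entry point is `hdfs dfsadmin`. A minimal sketch of the safemode subcommands (the helper function is hypothetical):

```shell
# Hypothetical wrapper over the hdfs dfsadmin safemode subcommands.
# $1 is one of: get | leave | enter | wait
safemode() {
  hdfs dfsadmin -safemode "$1"
}
```

For example, `safemode get` reports the current state and `safemode leave` forces the namenode out of safe mode.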

Re: HDFS Namenode in safemode when I turn off my EC2 instance

2015-01-22 Thread Sean Owen
If you are using CDH, you would be shutting down services with Cloudera Manager. I believe you can do it manually using Linux 'services' if you do the steps correctly across your whole cluster. I'm not sure if the stock stop-all.sh script is supposed to work. Certainly, if you are using CM, by far

Re: HDFS Namenode in safemode when I turn off my EC2 instance

2015-01-17 Thread Su She
Thanks Akhil and Sean for the responses. I will try shutting down spark, then storage and then the instances. Initially, when hdfs was in safe mode, I waited for 1 hour and the problem still persisted. I will try this new method. Thanks! On Sat, Jan 17, 2015 at 2:03 AM, Sean Owen

Re: HDFS Namenode in safemode when I turn off my EC2 instance

2015-01-17 Thread Akhil Das
The safest way would be to first shut down HDFS, then shut down Spark (calling stop-all.sh would do), and then shut down the machines. You can execute the following command to disable safe mode: *hadoop fs -safemode leave* Thanks Best Regards On Sat, Jan 17, 2015 at 8:31 AM, Su She

Re: HDFS Namenode in safemode when I turn off my EC2 instance

2015-01-17 Thread Sean Owen
You would not want to turn off storage underneath Spark. Shut down Spark first, then storage, then shut down the instances. Reverse the order when restarting. HDFS will be in safe mode for a short time after being started before it becomes writeable. I would first check that it's not just that.
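Sean's point that HDFS sits in safe mode only briefly after startup suggests polling before forcing it off. A sketch, assuming the standard "Safe mode is ON/OFF" output of `hdfs dfsadmin -safemode get` (the retry count and sleep interval are arbitrary choices):

```shell
# Poll the namenode until it leaves safe mode on its own, rather than
# forcing it off. HDFS exits safe mode once enough block reports arrive
# after a restart, so a short wait is often all that is needed.
wait_for_safemode_off() {
  tries=0
  max=${1:-30}   # number of polls before giving up
  while [ "$tries" -lt "$max" ]; do
    if hdfs dfsadmin -safemode get 2>/dev/null | grep -q "Safe mode is OFF"; then
      return 0   # namenode is writable
    fi
    tries=$((tries + 1))
    sleep 5
  done
  return 1       # still in safe mode; something else may be wrong
}
```

If the loop times out, safe mode is likely stuck for a real reason (e.g. missing blocks), and forcing it off with `-safemode leave` only masks that.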

HDFS Namenode in safemode when I turn off my EC2 instance

2015-01-16 Thread Su She
Hello Everyone, I am encountering trouble running Spark applications when I shut down my EC2 instances. Everything else seems to work except Spark. When I try running a simple Spark application, like sc.parallelize(), I get the message that the HDFS namenode is in safe mode. Has anyone else had this