We have been using this setup for a very long time. We were able to run all the jobs successfully, but something suddenly went wrong with the namenode.
On Thu, Apr 9, 2015 at 12:06 AM, sandeep vura <[email protected]> wrote:
> I have also noticed another issue when starting the Hadoop cluster with the
> start-all.sh command.
>
> The namenode and datanode daemons start, but sometimes one of the datanodes
> drops the connection and shows the message "connection closed by
> (192.168.2.x - datanode)". Every time I restart the Hadoop cluster, the
> affected datanode keeps changing.
>
> For example, the 1st time I start the Hadoop cluster: 192.168.2.1 -
> connection closed. The 2nd time: 192.168.2.2 - connection closed. At this
> point 192.168.2.1 starts successfully without any errors.
>
> I haven't been able to figure out the issue exactly. Is it related to the
> network or to the Hadoop configuration?
>
>
> On Wed, Apr 8, 2015 at 11:54 PM, Liaw, Huat (MTO) <[email protected]> wrote:
>
>> hadoop fs -put <source> <destination>  -- copy from a local location to HDFS
>>
>> *From:* sandeep vura [mailto:[email protected]]
>> *Sent:* April 8, 2015 2:24 PM
>> *To:* [email protected]
>> *Subject:* Re: Unable to load file from local to HDFS cluster
>>
>> Sorry Liaw, I tried the same command but it didn't resolve the issue.
>>
>> Regards,
>> Sandeep.V
>>
>> On Wed, Apr 8, 2015 at 11:37 PM, Liaw, Huat (MTO) <[email protected]> wrote:
>>
>> It should be: hadoop dfs -put
>>
>> *From:* sandeep vura [mailto:[email protected]]
>> *Sent:* April 8, 2015 1:53 PM
>> *To:* [email protected]
>> *Subject:* Unable to load file from local to HDFS cluster
>>
>> Hi,
>>
>> When loading a file from local to the HDFS cluster using the command below:
>>
>> hadoop fs -put sales.txt /sales_dept
>>
>> I get the following exception. Please let me know how to resolve this
>> issue as soon as possible. Attached are the logs displayed on the
>> namenode.
>>
>> Regards,
>> Sandeep.v
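For anyone hitting the same symptoms, a minimal diagnostic sketch of the steps discussed in this thread. It assumes a running cluster, the `hadoop`/`hdfs` binaries on PATH, and the default log directory under `$HADOOP_HOME/logs`; the log filename pattern and the `/sales_dept` target are illustrative:

```shell
# List the Hadoop daemons running on this node; NameNode and DataNode
# should both appear if start-all.sh succeeded.
jps

# Show cluster-wide datanode status (live/dead nodes, capacity, last
# contact). A datanode whose connection keeps closing will show up as
# dead or missing here.
hdfs dfsadmin -report

# Inspect the most recent namenode log for the "connection closed"
# messages. Path pattern is illustrative; adjust to your installation.
tail -n 100 "$HADOOP_HOME"/logs/hadoop-*-namenode-*.log

# Retry the upload once the datanodes are healthy. Creating the target
# directory first avoids a failure on a missing parent.
hadoop fs -mkdir -p /sales_dept
hadoop fs -put sales.txt /sales_dept/
```

When a different datanode drops on each restart, hostname resolution (`/etc/hosts` entries for the 192.168.2.x nodes) and firewall rules between the nodes are common culprits worth checking before the Hadoop configuration itself.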
