Our issue has been resolved.

Root cause: a network-related issue.

Thanks to everyone who spent time on this and replied to my questions.

Regards,
Sandeep.v

On Thu, Apr 9, 2015 at 10:45 AM, sandeep vura <[email protected]> wrote:

> Can anyone suggest a solution for my issue?
>
> On Thu, Apr 9, 2015 at 12:48 AM, sandeep vura <[email protected]>
> wrote:
>
>> Exactly, but each time it picks a datanode at random. Our datanodes are
>> 192.168.2.81, 192.168.2.82, 192.168.2.83, 192.168.2.84, 192.168.2.85
>>
>> Namenode: 192.168.2.80
>>
>> If I restart the cluster, the next time it will show 192.168.2.81:50010
>> connection closed.
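>>
>> For reference, the namenode's current view of live and dead datanodes
>> can be listed with the standard admin command (run on the namenode):
>>
>> hdfs dfsadmin -report    (hadoop dfsadmin -report on older releases)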
>>
>> On Thu, Apr 9, 2015 at 12:28 AM, Liaw, Huat (MTO) <[email protected]>
>> wrote:
>>
>>>  You cannot start 192.168.2.84:50010... connection closed by
>>> (192.168.2.x - datanode)
>>>
>>>
>>>
>>> *From:* sandeep vura [mailto:[email protected]]
>>> *Sent:* April 8, 2015 2:39 PM
>>>
>>> *To:* [email protected]
>>> *Subject:* Re: Unable to load file from local to HDFS cluster
>>>
>>>
>>>
>>> We have been using this setup for a very long time. We were able to run
>>> all jobs successfully, but something suddenly went wrong with the namenode.
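>>>
>>> When the namenode misbehaves, its log is usually the quickest pointer.
>>> Assuming a default install layout (the exact path varies by setup):
>>>
>>> tail -n 100 $HADOOP_HOME/logs/hadoop-*-namenode-*.log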
>>>
>>>
>>>
>>> On Thu, Apr 9, 2015 at 12:06 AM, sandeep vura <[email protected]>
>>> wrote:
>>>
>>> I have also noticed another issue when starting the Hadoop cluster with
>>> the start-all.sh command.
>>>
>>>
>>>
>>> The namenode and datanode daemons start, but sometimes one of the
>>> datanodes drops its connection and shows the message "connection closed
>>> by (192.168.2.x - datanode)". Every time I restart the Hadoop cluster,
>>> the affected datanode keeps changing.
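>>>
>>> One way to confirm which daemons actually came up on each node, using
>>> the same passwordless SSH that start-all.sh relies on (jps ships with
>>> the JDK):
>>>
>>> for h in 192.168.2.81 192.168.2.82 192.168.2.83 192.168.2.84 192.168.2.85; do
>>>   echo "== $h =="; ssh "$h" jps
>>> done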
>>>
>>>
>>>
>>> For example, the first time I start the Hadoop cluster: 192.168.2.1 -
>>> connection closed.
>>>
>>> The second time I start the Hadoop cluster: 192.168.2.2 - connection
>>> closed, and this time 192.168.2.1 starts successfully without any errors.
>>>
>>>
>>>
>>> I haven't been able to figure out the exact issue. Is it related to the
>>> network or to the Hadoop configuration?
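>>>
>>> One way to separate the two: when the error appears, check whether the
>>> affected datanode's transfer port is reachable from the namenode (50010
>>> is the default; nc is standard netcat):
>>>
>>> nc -z -v 192.168.2.81 50010
>>>
>>> If the port is reachable but the connection still drops, that datanode's
>>> own log is the next place to look.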
>>>
>>>
>>>
>>>
>>>
>>>
>>>
>>> On Wed, Apr 8, 2015 at 11:54 PM, Liaw, Huat (MTO) <[email protected]>
>>> wrote:
>>>
>>> hadoop fs -put <source> <destination> copies from the local filesystem to HDFS.
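>>>
>>> For example (paths here are illustrative):
>>>
>>> hadoop fs -mkdir /sales_dept
>>> hadoop fs -put /home/user/sales.txt /sales_dept/
>>> hadoop fs -ls /sales_dept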
>>>
>>>
>>>
>>> *From:* sandeep vura [mailto:[email protected]]
>>> *Sent:* April 8, 2015 2:24 PM
>>> *To:* [email protected]
>>> *Subject:* Re: Unable to load file from local to HDFS cluster
>>>
>>>
>>>
>>> Sorry Liaw, I tried the same command but it didn't resolve the issue.
>>>
>>>
>>>
>>> Regards,
>>>
>>> Sandeep.V
>>>
>>>
>>>
>>> On Wed, Apr 8, 2015 at 11:37 PM, Liaw, Huat (MTO) <[email protected]>
>>> wrote:
>>>
>>> Should be hadoop dfs -put
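>>>
>>> (On recent releases "hadoop dfs" is deprecated in favor of "hdfs dfs";
>>> with the file from this thread, equivalent forms would be:)
>>>
>>> hdfs dfs -put sales.txt /sales_dept
>>> hadoop fs -put sales.txt /sales_dept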
>>>
>>>
>>>
>>> *From:* sandeep vura [mailto:[email protected]]
>>> *Sent:* April 8, 2015 1:53 PM
>>> *To:* [email protected]
>>> *Subject:* Unable to load file from local to HDFS cluster
>>>
>>>
>>>
>>> Hi,
>>>
>>>
>>>
>>> When loading a file from the local filesystem to the HDFS cluster with
>>> the command below:
>>>
>>>
>>> hadoop fs -put sales.txt /sales_dept
>>>
>>>
>>>
>>> I am getting the following exception. Please let me know how to resolve
>>> this issue as soon as possible. Please find attached the logs displayed
>>> on the namenode.
>>>
>>>
>>>
>>> Regards,
>>>
>>> Sandeep.v
>>>
>>>
>>>
>>>
>>>
>>>
>>>
>>
>>
>
