It isn't a firewall problem -- I've disabled the firewall.
Can you find the problem?
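For what it's worth, a quick way to rule the network in or out, independent of both the firewall and Hadoop, is a bare TCP connect to the ports in question. The helper below is just an illustration (the name `can_connect` and the usage lines are mine, not from any Hadoop tooling):

```python
import socket

def can_connect(host, port, timeout=5.0):
    """Return True if a plain TCP connection to host:port succeeds within timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# From the master, one could probe the ports from the config below, e.g.:
#   can_connect("10.229.62.6", 50010)   # namenode RPC, per fs.default.name
#   can_connect("10.229.62.56", 50010)  # datanode port on the slave
```

If the connect fails even with the firewall down, the daemon is simply not listening on that port, which points at the Hadoop configuration rather than the network.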


Eugene Weinstein wrote:
> 
> It could be a firewall issue -- for me the firewall was blocking port
> 50010.
> 
> Eugene
> 
> On 3/1/07, [EMAIL PROTECTED] <
> [EMAIL PROTECTED]> wrote:
>>
>>
>>
>> Hello,
>>
>> I'm evaluating Hadoop for a large application.
>>
>> When running the wordcount example, I experience an issue where my
>> master node cannot open a socket to port 50010 of my remote slave node.
>>
>> When I run the example with only my master in the slaves file, it
>> works fine. When I add a second machine, I get the error.
>>
>> Here is my config:
>>
>> Running Hadoop-0.11.0
>>
>> Server for master (10.229.62.6)
>> Remote slave (10.229.62.56)
>>
>> My conf/slaves file content
>>
>> ===================
>> localhost
>> [EMAIL PROTECTED]
>> ====================
>>
>> My masters file content
>>
>> ==============
>> localhost
>> ==============
>>
>> I've set the HADOOP_HOME and JAVA_HOME environment variables.
>>
>>
>> Using the standard hadoop-default.xml.
>>
>> Here's my hadoop-site.xml (which is the same on both machines):
>>
>> <?xml version="1.0"?>
>> <?xml-stylesheet type="text/xsl" href="configuration.xsl"?>
>>
>> <!-- Put site-specific property overrides in this file. -->
>>
>> <configuration>
>>
>> <property>
>> <name>fs.default.name</name>
>> <value>10.229.62.6:50010</value>
>> </property>
>>
>> <property>
>> <name>mapred.job.tracker</name>
>> <value>10.229.62.6:50011</value>
>> </property>
>>
>> <property>
>> <name>dfs.replication</name>
>> <value>2</value>
>> </property>
>>
>> <property>
>> <name>dfs.datanode.port</name>
>> <value>50010</value>
>> </property>
>>
>> <property>
>> <name>dfs.info.port</name>
>> <value>50070</value>
>> </property>
>>
>> <property>
>> <name>dfs.name.dir</name>
>> <value>/tmp/hadoop-146736/dfs/name</value>
>> </property>
>>
>> <property>
>> <name>dfs.data.dir</name>
>> <value>/tmp/hadoop-146736/dfs/data</value>
>> </property>
>>
>> <property>
>> <name>dfs.client.buffer.dir</name>
>> <value>/tmp/hadoop-146736/dfs/tmp</value>
>> </property>
>>
>> <property>
>> <name>mapred.local.dir</name>
>> <value>/tmp/hadoop-jaya/mapred/local</value>
>> </property>
>>
>> <property>
>> <name>mapred.system.dir</name>
>> <value>/tmp/hadoop-jaya/mapred/system</value>
>> </property>
>>
>> <property>
>> <name>mapred.temp.dir</name>
>> <value>/tmp/hadoop-jaya/mapred/temp</value>
>> </property>
>>
>> <property>
>> <name>mapred.job.tracker.info.port</name>
>> <value>50030</value>
>> </property>
>>
>> <property>
>> <name>mapred.task.tracker.report.port</name>
>> <value>50050</value>
>> </property>
>>
>> </configuration>
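One detail that stands out in the config above: fs.default.name points the namenode RPC at port 50010, while dfs.datanode.port also assigns 50010 to every datanode -- and with localhost in the slaves file, a datanode runs on the master too, so the two daemons may be contending for the same port there. A small sketch of a duplicate-port check over the port-bearing properties (the `conf` dict just restates values from the file above; the helper name is mine):

```python
import re
from collections import defaultdict

# Port-bearing entries copied from the hadoop-site.xml above.
conf = {
    "fs.default.name": "10.229.62.6:50010",
    "mapred.job.tracker": "10.229.62.6:50011",
    "dfs.datanode.port": "50010",
    "dfs.info.port": "50070",
    "mapred.job.tracker.info.port": "50030",
    "mapred.task.tracker.report.port": "50050",
}

def ports_in_use(conf):
    """Map each TCP port mentioned in the config to the properties that use it."""
    by_port = defaultdict(list)
    for name, value in conf.items():
        m = re.search(r"(\d+)$", value)  # trailing port number, if any
        if m:
            by_port[int(m.group(1))].append(name)
    return by_port

conflicts = {p: names for p, names in ports_in_use(conf).items() if len(names) > 1}
# → {50010: ['fs.default.name', 'dfs.datanode.port']}
```

Moving fs.default.name to an unused port (or changing dfs.datanode.port) would remove the collision; whether that is the whole story here I can't say for certain, but it is worth checking before anything else.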
>>
>> Here is the terminal output on the server:
>>
>> [EMAIL PROTECTED] hadoop-0.11.0]$ bin/hadoop dfs input
>> 07/03/01 14:43:22 INFO ipc.Client: Retrying connect to server:
>> /10.229.62.6:50010. Already tried 1 time(s).
>> 07/03/01 14:43:23 INFO ipc.Client: Retrying connect to server:
>> /10.229.62.6:50010. Already tried 2 time(s).
>> 07/03/01 14:43:24 INFO ipc.Client: Retrying connect to server:
>> /10.229.62.6:50010. Already tried 3 time(s).
>> 07/03/01 14:43:25 INFO ipc.Client: Retrying connect to server:
>> /10.229.62.6:50010. Already tried 4 time(s).
>> 07/03/01 14:43:26 INFO ipc.Client: Retrying connect to server:
>> /10.229.62.6:50010. Already tried 5 time(s).
>> 07/03/01 14:43:27 INFO ipc.Client: Retrying connect to server:
>> /10.229.62.6:50010. Already tried 6 time(s).
>> 07/03/01 14:43:28 INFO ipc.Client: Retrying connect to server:
>> /10.229.62.6:50010. Already tried 7 time(s).
>> 07/03/01 14:43:29 INFO ipc.Client: Retrying connect to server:
>> /10.229.62.6:50010. Already tried 8 time(s).
>> 07/03/01 14:43:30 INFO ipc.Client: Retrying connect to server:
>> /10.229.62.6:50010. Already tried 9 time(s).
>> 07/03/01 14:43:31 INFO ipc.Client: Retrying connect to server:
>> /10.229.62.6:50010. Already tried 10 time(s).
>> Bad connection to FS. command aborted.
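Those log lines are the DFS client's connect loop: one attempt per second, ten retries, then it gives up with "Bad connection to FS". Schematically (a sketch of the observed behavior, not Hadoop's actual ipc.Client code):

```python
import time

def connect_with_retries(try_connect, max_retries=10, delay=1.0):
    """Retry a connect callable up to max_retries times, pausing `delay`
    seconds between attempts, mirroring the retry lines in the log above."""
    for attempt in range(1, max_retries + 1):
        if try_connect():
            return True
        print(f"Retrying connect to server. Already tried {attempt} time(s).")
        time.sleep(delay)
    return False  # caller reports "Bad connection to FS. command aborted."
```

The point is that nothing ever answers on 10.229.62.6:50010, so the client exhausts its retries -- the question is why no daemon is listening there.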
>>
>> Can anyone help me identify the issue? Is there some other setup step
>> I'm missing? Please help me find and solve it.
>>
>> Thanks & Regards,
>> Jayalakshmi
>>
>>
> 
> 

-- 
View this message in context: 
http://www.nabble.com/Running-hadoop-in-2-systems-tf3325881.html#a9263757
Sent from the Hadoop Users mailing list archive at Nabble.com.
