What does "host 192.168.1.99" output?
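
If reverse lookup is configured, you should see something like this
(the hostname here is just a placeholder):

    99.1.168.192.in-addr.arpa domain name pointer nn-host.remote.

If it instead resolves to a name associated with 172.18.164.52, or
returns NXDOMAIN, that mismatch may explain why the NameNode ends up
on the wrong interface.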

(Also, slightly OT, but you need to fix this:)

Do not use IPs in your fs location. Do the following instead:

1. Append an entry to /etc/hosts, across all nodes:

192.168.1.99 nn-host.remote nn-host

2. Set fs.default.name to "hdfs://nn-host.remote" (a sketch of both steps follows below)
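
A minimal sketch of both steps; nn-host.remote is a placeholder, so
substitute your real hostname and keep the port you already use:

    # run on every node, as root
    echo "192.168.1.99 nn-host.remote nn-host" >> /etc/hosts

Then in core-site.xml:

    <property>
      <name>fs.default.name</name>
      <value>hdfs://nn-host.remote:8020</value>
    </property>

With the hostname in place, every daemon resolves the NameNode through
the same /etc/hosts entry, instead of depending on whichever interface
a raw IP happens to map to.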

On Tue, Jan 31, 2012 at 3:18 AM, anil gupta <[email protected]> wrote:
> Hi All,
>
> I am using hadoop-0.20.2 and doing a fresh installation of a distributed
> Hadoop cluster along with HBase. I have virtualized nodes running
> on top of a VMware ESXi 5.0 server.
>
> The VM on which namenode is running has two network interfaces.
> 1. HWaddr: 00:0C:29:F8:59:5C
>    IP address: 192.168.1.99
>
> 2. HWaddr: 00:0C:29:F8:59:52
>    IP address: 172.18.164.52
>
> Here is the core-site.xml file:
> <property>
> <name>fs.default.name</name>
> <value>hdfs://192.168.1.99:8020</value>
> </property>
>
> As per the above configuration, the NameNode service should be listening
> on 192.168.1.99, but it keeps binding to 172.18.164.52.
>
> Am I missing any configuration parameters here?
> Is there any way to bind Hadoop services to a specific Ethernet interface?
>
> Thanks in advance for your help.
> -Anil Gupta



-- 
Harsh J
Customer Ops. Engineer, Cloudera
