Hi,

I found the solution to the problem I posted earlier; I am posting the
resolution here so that others may benefit from it.

The incompatibility error on my slave turned out to be caused by an
incompatible Java installation there. I removed the existing Java
installation from the slave, installed the same version I have on the
master, and that solved the problem.
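For anyone hitting the same thing: a quick way to catch this kind of mismatch before starting the daemons is to compare the `java -version` banner on each node. This is only a sketch; the `banner_of`/`same_version` helper names are mine, and the hostname `v-desktop` (from my conf/slaves below) is just an example.

```shell
#!/bin/sh
# Sketch: check that master and slave run the same Java build.
# Helper names and hostnames are illustrative, not part of Hadoop.

banner_of() {
    # Run a command and keep only the first line of its combined output
    # (`java -version` writes its banner to stderr, hence 2>&1).
    "$@" 2>&1 | head -n 1
}

same_version() {
    # Succeeds only when the two version banners are byte-identical.
    [ "$1" = "$2" ]
}

# On a real cluster you would run something like:
#   master_ver=$(banner_of java -version)
#   slave_ver=$(ssh v-desktop 'java -version 2>&1 | head -n 1')
#   same_version "$master_ver" "$slave_ver" || echo "Java mismatch" >&2
```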

Thanks all, for your responses.

Ved

2008/3/5 Ved Prakash <[EMAIL PROTECTED]>:

> Hi Miles,
>
> Yes, I have hadoop-0.15.2 installed on both my systems.
>
> Ved
>
> 2008/3/5 Miles Osborne <[EMAIL PROTECTED]>:
>
> Did you use exactly the same version of Hadoop on each and every node?
> >
> > Miles
> >
> > On 05/03/2008, Ved Prakash <[EMAIL PROTECTED]> wrote:
> > >
> > > Hi Zhang,
> > >
> > > Thanks for your reply. I tried this, but it did not help; it still
> > > throws "Incompatible build versions".
> > >
> > > I removed the DFS local directory on the slave and ran start-dfs.sh on
> > > the master, but when I checked the logs, the same problem showed up.
> > >
> > > Do you need any more information from me to get a better
> > > understanding of the problem?
> > >
> > > Please let me know,
> > >
> > > Thanks
> > >
> > > Ved
> > >
> > > 2008/3/5 Zhang, Guibin <[EMAIL PROTECTED]>:
> > >
> > > > You can delete the DFS local dir on the slave (the local directory
> > > > should be ${hadoop.tmp.dir}/dfs/) and try again.
> > > >
> > > >
> > > > -----Original Message-----
> > > > From: Ved Prakash [mailto:[EMAIL PROTECTED]
> > > > Sent: March 5, 2008 14:51
> > > > To: [email protected]
> > > > Subject: clustering problem
> > > >
> > > > Hi Guys,
> > > >
> > > > I am having problems setting up a cluster on 2 machines.
> > > >
> > > > Machine configuration:
> > > > Master: OS: Fedora Core 7
> > > >         hadoop-0.15.2
> > > >
> > > > hadoop-site.xml listing
> > > >
> > > > <configuration>
> > > >  <property>
> > > >    <name>fs.default.name</name>
> > > >    <value>anaconda:50001</value>
> > > >  </property>
> > > >  <property>
> > > >    <name>mapred.job.tracker</name>
> > > >    <value>anaconda:50002</value>
> > > >  </property>
> > > >  <property>
> > > >    <name>dfs.replication</name>
> > > >    <value>2</value>
> > > >  </property>
> > > >  <property>
> > > >    <name>dfs.secondary.info.port</name>
> > > >    <value>50003</value>
> > > >  </property>
> > > >  <property>
> > > >    <name>dfs.info.port</name>
> > > >    <value>50004</value>
> > > >  </property>
> > > >  <property>
> > > >    <name>mapred.job.tracker.info.port</name>
> > > >    <value>50005</value>
> > > >  </property>
> > > >  <property>
> > > >    <name>tasktracker.http.port</name>
> > > >    <value>50006</value>
> > > >  </property>
> > > > </configuration>
> > > >
> > > > conf/masters
> > > > localhost
> > > >
> > > > conf/slaves
> > > > anaconda
> > > > v-desktop
> > > >
> > > > The datanode, namenode, and secondarynamenode seem to be working fine
> > > > on the master, but on the slave this is not the case.
> > > >
> > > > slave
> > > > OS: Ubuntu
> > > >
> > > > hadoop-site.xml listing
> > > >
> > > > same as master
> > > >
> > > > in the logs on slave machine I see this
> > > >
> > > > 2008-03-05 12:15:25,705 INFO org.apache.hadoop.metrics.jvm.JvmMetrics:
> > > > Initializing JVM Metrics with processName=DataNode, sessionId=null
> > > > 2008-03-05 12:15:25,920 FATAL org.apache.hadoop.dfs.DataNode:
> > > > Incompatible build versions: namenode BV = Unknown; datanode BV = 607333
> > > > 2008-03-05 12:15:25,926 ERROR org.apache.hadoop.dfs.DataNode:
> > > > java.io.IOException: Incompatible build versions: namenode BV = Unknown;
> > > > datanode BV = 607333
> > > >         at org.apache.hadoop.dfs.DataNode.handshake(DataNode.java:316)
> > > >         at org.apache.hadoop.dfs.DataNode.startDataNode(DataNode.java:238)
> > > >         at org.apache.hadoop.dfs.DataNode.<init>(DataNode.java:206)
> > > >         at org.apache.hadoop.dfs.DataNode.makeInstance(DataNode.java:1575)
> > > >         at org.apache.hadoop.dfs.DataNode.run(DataNode.java:1519)
> > > >         at org.apache.hadoop.dfs.DataNode.createDataNode(DataNode.java:1540)
> > > >         at org.apache.hadoop.dfs.DataNode.main(DataNode.java:1711)
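[Editor's aside for readers of the archive: the "BV" values in this log are Subversion build revisions. `bin/hadoop version` prints them on each node, and comparing its output across machines shows whether both sides run the same build. A sketch; the `build_revision` helper is mine, and the output format shown is the 0.15.x-era one.]

```shell
#!/bin/sh
# build_revision: extract the Subversion revision from the output of
# `bin/hadoop version`, which in the 0.15.x era looks roughly like:
#   Hadoop 0.15.2
#   Subversion https://svn.apache.org/repos/asf/lucene/hadoop/... -r 607333
#   Compiled by hadoopqa on ...
build_revision() {
    printf '%s\n' "$1" | sed -n 's/.* -r \([0-9][0-9]*\).*/\1/p'
}

# Compare across nodes with something like:
#   master_rev=$(build_revision "$(bin/hadoop version)")
#   slave_rev=$(build_revision "$(ssh v-desktop 'bin/hadoop version')")
#   [ "$master_rev" = "$slave_rev" ] || echo "build mismatch" >&2
```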
> > > >
> > > > Can someone help me with this, please?
> > > >
> > > > Thanks
> > > >
> > > > Ved
> > > >
> > >
> >
> >
> >
> > --
> > The University of Edinburgh is a charitable body, registered in
> > Scotland,
> > with registration number SC005336.
> >
>
>
