This happens to me if I don't do a clean build, or if there is a jar
containing an older Hadoop version somewhere on the classpath. Try
"ant clean" before building and check for older releases.
Please let me know if that helps.
You may also run into a problem related to HADOOP-1212 after formatting the
namenode.
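For reference, a minimal sketch of the clean rebuild I mean (the ant
target and the locations searched for stale jars are only examples;
adjust them to however you build and to wherever jars end up on your
classpath):

    # rebuild from a clean source tree
    ant clean
    ant jar        # or whatever target you normally build with

    # look for older hadoop jars that could shadow the new build
    find $HADOOP_HOME ~/.ant /usr/local -name 'hadoop-*.jar' 2>/dev/null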
Sorry to bother you. I cleared the hadoop directory on each node and then
re-formatted the namenode; the problem is solved.
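In case it helps anyone else, roughly the steps I followed, assuming the
default dfs.data.dir under /tmp/hadoop-${USER} (check hadoop-site.xml if
you have changed it):

    # stop dfs before touching the storage directories
    ./bin/stop-dfs.sh

    # on every datanode, remove the old data directory so the
    # namespaceID from the previous format does not linger
    rm -rf /tmp/hadoop-${USER}/dfs/data

    # then re-format the namenode and bring dfs back up
    ./bin/hadoop namenode -format
    ./bin/start-dfs.sh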
2008/2/21, Zhu Huijun <[EMAIL PROTECTED]>:
>
> Hi,
>
> I have a problem running Hadoop on our cluster. After I formatted the
> namenode by executing "./bin/hadoop namenode -format" and started ...
Hi,
I have a problem running Hadoop on our cluster. After I formatted the
namenode by executing "./bin/hadoop namenode -format" and started the dfs
by executing "./bin/start-dfs.sh", I can access the namenode web page at
http://localhost:50070, but the number of live datanodes is 0. If I run
".