Hi All,

I am trying to run the example wordcount application on a cluster in my
university's lab. Since I don't have write access to /etc/hosts (and the
admin won't add entries for each node in the cluster), I am using the
nodes' IP addresses in all of Hadoop's configuration files. Copying the
input files into HDFS works fine, but when I start the job I get this
message:
Error initializing attempt_200911102009_0001_m_000002_1:
java.lang.IllegalArgumentException: Wrong FS: hdfs://128.226.118.98:54310/var/work/aselvan1/hadoop-tmp/mapred/system/job_200911102009_0001/job.xml, expected: hdfs://node22.cs.binghamton.edu:54310
        at org.apache.hadoop.fs.FileSystem.checkPath(FileSystem.java:327)
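For reference, these are the relevant entries in my hadoop-site.xml (the
values match the URIs in the error above; I have omitted the other
properties):

    <configuration>
      <property>
        <name>fs.default.name</name>
        <!-- IP address instead of a hostname, since I cannot edit /etc/hosts -->
        <value>hdfs://128.226.118.98:54310</value>
      </property>
      <property>
        <name>hadoop.tmp.dir</name>
        <value>/var/work/aselvan1/hadoop-tmp</value>
      </property>
    </configuration>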

I am using release 0.18.3. I tried to track down the cause of the error,
and I think the authority check in org.apache.hadoop.fs.FileSystem.checkPath
fails, probably because my configuration files refer to the NameNode by IP
address while it identifies itself by its hostname. I am stuck here; any
help is highly appreciated.

Thank you,
Arun