Dear all,

I'm currently working with Hadoop and the map/reduce paradigm, and I'm having some problems with the Eclipse plugin.
The software versions I'm working with are:

- JDK: jdk1.6.0_16
- Eclipse: Galileo 3.5 (I have also tried Ganymede and had the same problem)
- hadoop-0.19.1-eclipse-plugin.jar (the Eclipse plugin)

From the terminal I can work perfectly with hadoop-0.18.3, and I have successfully run the WordCount example. But I don't know whether both of them (the terminal Hadoop and the Eclipse plugin's Hadoop) must be the same version.

My hadoop-site.xml looks like this:

<?xml version="1.0"?>
<?xml-stylesheet type="text/xsl" href="configuration.xsl"?>
<configuration>
  <property>
    <name>hadoop.tmp.dir</name>
    <value>/tmp/hadoop-${user.name}</value>
  </property>
  <property>
    <name>fs.default.name</name>
    <value>hdfs://localhost:54310</value>
  </property>
  <property>
    <name>mapred.job.tracker</name>
    <value>localhost:54311</value>
  </property>
  <property>
    <name>dfs.replication</name>
    <value>8</value>
  </property>
  <property>
    <name>mapred.child.java.opts</name>
    <value>-Xmx512m</value>
  </property>
</configuration>

In the Map/Reduce Locations tab of the Eclipse plugin, when I configure the port numbers 54310 and 54311, I get an exception that says:

Error: Call to localhost/127.0.0.1:54311 failed on connection exception: java... connection refused

If I try other port numbers, like 9001 and 9000, the error is different and it says:

Error: DFS browser expects a DistributedFileSystem

Any help will be welcome.

Joan
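P.S. To check whether the "connection refused" error comes from the plugin or simply from nothing listening on those ports, here is a small diagnostic I use. It is a hypothetical helper (not part of Hadoop), written against the plain Java 6 socket API; the port numbers are the ones from my hadoop-site.xml above:

```java
import java.io.IOException;
import java.net.InetSocketAddress;
import java.net.Socket;

/** Quick check that a daemon is actually listening on a host:port
 *  before blaming the Eclipse plugin configuration. */
public class PortCheck {

    /** Returns true if a TCP connection to host:port succeeds within 1s. */
    static boolean isListening(String host, int port) {
        Socket s = new Socket();
        try {
            s.connect(new InetSocketAddress(host, port), 1000);
            return true;               // something accepted the connection
        } catch (IOException e) {
            return false;              // "connection refused" ends up here
        } finally {
            try { s.close(); } catch (IOException ignored) { }
        }
    }

    public static void main(String[] args) {
        // The two ports from hadoop-site.xml: NameNode and JobTracker.
        System.out.println("54310 (fs.default.name):    "
                + isListening("localhost", 54310));
        System.out.println("54311 (mapred.job.tracker): "
                + isListening("localhost", 54311));
    }
}
```

If both print false, the daemons are not running (or are bound to other ports), so the plugin settings are not the problem.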