Hey,

This was indeed the case (javac was the wrong version), but now I ran into another problem.
When running the program with the line:

    bin/hadoop jar usr/joe/wordcount.jar org.myorg.WordCount usr/joe/wordcount/input usr/joe/wordcount/output

the result is:

    Exception in thread "main" java.lang.ClassNotFoundException: org.myorg.WordCount
            at java.net.URLClassLoader$1.run(URLClassLoader.java:200)
            at java.security.AccessController.doPrivileged(Native Method)
            at java.net.URLClassLoader.findClass(URLClassLoader.java:188)
            at java.lang.ClassLoader.loadClass(ClassLoader.java:303)
            at java.lang.ClassLoader.loadClass(ClassLoader.java:248)
            at java.lang.ClassLoader.loadClassInternal(ClassLoader.java:316)
            at java.lang.Class.forName0(Native Method)
            at java.lang.Class.forName(Class.java:247)
            at org.apache.hadoop.util.RunJar.main(RunJar.java:149)

I'm guessing this is a classpath issue, so I tried adding the path where the .jar file is to the system env CLASSPATH variable, but this didn't help. I also heard that there should be a temporary classes dir in the Hadoop home; I created this and added it to the classpath, but that didn't work either. Has anyone seen something similar, or know of possible fixes?

Thanks again,
Mikko

----- Original Message -----
From: Todd Lipcon
To: common-user@hadoop.apache.org; Mikko Lahti
Sent: Wednesday, December 02, 2009 9:03 PM
Subject: Re: Trouble with tutorial

Make sure you're using Java 6. Hadoop is compiled with Java 6, and it smells like you're using an earlier JDK.

-Todd

On Wed, Dec 2, 2009 at 11:01 AM, Mikko Lahti <mikko.la...@pp1.inet.fi> wrote:

Hi,

I'm trying to run the map/reduce tutorials with Windows XP and Cygwin. In the 2nd step:

    javac -classpath ${HADOOP_HOME}/hadoop-${HADOOP_VERSION}-core.jar -d wordcount_classes WordCount.java

I get the following error:

    WordCount.java:6: cannot access org.apache.hadoop.fs.Path
    bad class file: hadoop-0.19.2-core.jar(org/apache/hadoop/fs/Path.class)
    class file has wrong version 50.0, should be 49.0
    Please remove or make sure it appears in the correct subdirectory of the classpath.
    import org.apache.hadoop.fs.Path;
    ^
    1 error

I'm using the command:

    javac -classpath hadoop-0.19.2-core.jar -d wordcount_classes Wordcount.java

This is done at the root directory of Hadoop. I also ran the command with -verbose, which gives:

    [parsing started WordCount.java]
    [parsing completed 47ms]
    [search path for source files: [hadoop-0.19.2-core.jar]]
    [search path for class files: [c:\Program Files\Java\jdk1.5.0_09\jre\lib\rt.jar,
      c:\Program Files\Java\jdk1.5.0_09\jre\lib\jsse.jar,
      c:\Program Files\Java\jdk1.5.0_09\jre\lib\jce.jar,
      c:\Program Files\Java\jdk1.5.0_09\jre\lib\charsets.jar,
      c:\Program Files\Java\jdk1.5.0_09\jre\lib\ext\dnsns.jar,
      c:\Program Files\Java\jdk1.5.0_09\jre\lib\ext\jmf.jar,
      c:\Program Files\Java\jdk1.5.0_09\jre\lib\ext\localedata.jar,
      c:\Program Files\Java\jdk1.5.0_09\jre\lib\ext\sound.jar,
      c:\Program Files\Java\jdk1.5.0_09\jre\lib\ext\sunjce_provider.jar,
      c:\Program Files\Java\jdk1.5.0_09\jre\lib\ext\sunpkcs11.jar,
      hadoop-0.19.2-core.jar]]
    [loading c:\Program Files\Java\jdk1.5.0_09\jre\lib\rt.jar(java/io/IOException.class)]
    [loading hadoop-0.19.2-core.jar(org/apache/hadoop/fs/Path.class)]
    WordCount.java:6: cannot access org.apache.hadoop.fs.Path
    bad class file: hadoop-0.19.2-core.jar(org/apache/hadoop/fs/Path.class)
    class file has wrong version 50.0, should be 49.0
    Please remove or make sure it appears in the correct subdirectory of the classpath.
    import org.apache.hadoop.fs.Path;
    ^
    [total 453ms]
    1 error

Running the latest stable 0.20.1 makes no difference. What could be wrong?

- Mikko