The issue definitely lies with your CLASSPATH. When starting development against Hadoop 0.20, it is better to use the `hadoop jar` command to launch any jar that requires the Hadoop libraries, whether it is a MapReduce program or not. That command ensures all of the Hadoop-side classpath requirements are satisfied, so you don't have to worry about them.
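Roughly, `hadoop jar` works by building the classpath for you before invoking your main class. A minimal sketch of that idea (the install path here is a hypothetical example, not a real location, and the real script also adds every jar under `lib/`):

```shell
# Hypothetical install location, for illustration only.
HADOOP_HOME=/usr/local/hadoop-0.20.2

# Start with the conf/ dir and the core jar, as the real script does.
CP="$HADOOP_HOME/conf:$HADOOP_HOME/hadoop-0.20.2-core.jar"

# Append every dependency jar under lib/ (skipped here if none exist).
for jar in "$HADOOP_HOME"/lib/*.jar; do
  [ -e "$jar" ] && CP="$CP:$jar"
done

echo "$CP"
# Your own jar and main class then ride on top of this classpath.
```

So `hadoop jar HadoopHdfsHello.jar HadooHdfsHello` spares you from exporting CLASSPATH by hand.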
Anyhow, if you want to launch it with plain `java`, try it this way (note that `java -jar` ignores `-classpath` entirely, which is why your CLASSPATH export had no effect; put both jars on the classpath and name the main class instead):

$ java -classpath hadoop-0.20.2-core.jar:HadoopHdfsHello.jar HadooHdfsHello

On Mon, Jan 24, 2011 at 5:06 PM, Alessandro Binhara <[email protected]> wrote:
> Hello ..
>
> i solve problem in jar..
> i put a hadoop-core-0.20.2.jar in same jar dir.
>
> i configure a class path
> export CLASSPATH=.:$JAVA_HOME
>
> i got this erro in shell
>
> root:~# java -jar HahoopHdfsHello.jar
> Exception in thread "main" java.lang.NoClassDefFoundError:
> org/apache/hadoop/conf/Configuration
>         at HadooHdfsHello.main(HadooHdfsHello.java:18)
> Caused by: java.lang.ClassNotFoundException:
> org.apache.hadoop.conf.Configuration
>         at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
>         at java.security.AccessController.doPrivileged(Native Method)
>         at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
>         at java.lang.ClassLoader.loadClass(ClassLoader.java:307)
>         at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
>         at java.lang.ClassLoader.loadClass(ClassLoader.java:248)
>         ... 1 more
>
>
> What is the problem?
>
> thanks

-- 
Harsh J
www.harshj.com
