Yuduo,

You can certainly use java to run your programs. Consider two ways:

1. Pack the dependent libs' classes into your jar itself, or list the
dependent jars on its internal classpath via the manifest's Class-Path
attribute (IDEs and build tools help you do this). You then need no
runtime classpath or extra arguments, and can execute via plain
java -jar, which resolves all classes through the jar. This is what I
see most people do when shipping apps; a manifest sketch follows
after the list below.

2. Run with all jars on the classpath and the main class name given:
    $ java -classpath dependent:jars:myjar.jar org.mypackage.MainClassName
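
With the jars you mention below, and taking the package/class name
from your stack trace, that would look something like this (paths are
illustrative; adjust them to wherever the jars actually live):

    $ java -classpath hadoop-core-0.20.203.0.jar:commons-logging-1.1.1.jar:myjar.jar hdfs.HelloHDFS test

You will likely also want your Hadoop conf directory on that
classpath, so FileSystem.get() picks up your fs.default.name instead
of defaulting to the local filesystem.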

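For (1), a minimal manifest sketch, assuming (hypothetically) that the
dependent jars sit in a lib/ directory next to your own jar --
Class-Path entries are resolved relative to the jar's location:

    Main-Class: hdfs.HelloHDFS
    Class-Path: lib/hadoop-core-0.20.203.0.jar lib/commons-logging-1.1.1.jar

After that, plain "java -jar myjar.jar test" should resolve the Hadoop
classes.
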
On Wed, Nov 2, 2011 at 2:42 AM, Yuduo Zhou <yuduoz...@gmail.com> wrote:
> Hi all,
>
> I assume people usually execute Hadoop/HDFS jobs using something like
> "./hadoop jar myjar.jar". I'm wondering if there is any way to bypass the
> "./hadoop" part and just use "java" or "java -jar"? I'm only using APIs
> from HDFS, nothing from Hadoop MapReduce.
>
> I have a simple program here; it just writes one sentence into HDFS, then
> reads the content back and prints it to the screen.
>
> import java.io.*;
> import org.apache.hadoop.conf.Configuration;
> import org.apache.hadoop.fs.FSDataInputStream;
> import org.apache.hadoop.fs.FSDataOutputStream;
> import org.apache.hadoop.fs.FileSystem;
> import org.apache.hadoop.fs.Path;
>
> public class HelloHDFS {
>     public static void main (String [] args) throws IOException {
>         String filename = args[0];
>         Path file = new Path(filename);
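>         // The Configuration reads core-site.xml/hdfs-site.xml from the classpath;
>         // without them, FileSystem.get() falls back to the local filesystem.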
>         FileSystem hdfs = FileSystem.get(new Configuration());
>         FSDataOutputStream out = hdfs.create(file);
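>         // Note: writeUTF() prefixes the string with a two-byte length,
>         // so the raw byte dump below includes those two bytes as well.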
>         out.writeUTF("Foo foo foo!");
>         out.flush();
>         out.close();
>         FSDataInputStream in = hdfs.open(file);
>         int bytes;
>         byte[] buffer = new byte[512];
>         while ((bytes = in.read(buffer)) > 0) {
>                 System.out.write(buffer, 0, bytes);
>         }
>         in.close();
>     }
> }
>
> I pack this into a jar, and it runs successfully using "hadoop jar
> myjar.jar test". However, when I try to use "java -jar" or "java
> [classname]", it throws NoClassDefFoundError like this:
>
> $ java -jar bin/hdfsUtil.jar test
> Exception in thread "main" java.lang.NoClassDefFoundError:
> org/apache/hadoop/fs/Path
>         at hdfs.HelloHDFS.main(Unknown Source)
> Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.fs.Path
>         at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
>         at java.security.AccessController.doPrivileged(Native Method)
>         at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
>         at java.lang.ClassLoader.loadClass(ClassLoader.java:307)
>         at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
>         at java.lang.ClassLoader.loadClass(ClassLoader.java:248)
>         ... 1 more
>
> I have hadoop-core-0.20.203.0.jar and commons-logging-1.1.1.jar in my
> CLASSPATH and I'm sure they are undamaged, but I still get these
> class-not-found errors. Do I need to include other libraries to solve this?
>
> Thanks in advance,
> Yuduo Zhou
>
>



-- 
Harsh J
