Please run it in the same style. The binary 'java' accepts a -cp param too:

java -cp $($HADOOP_HOME/bin/hadoop classpath):. PutMerge

On Mon, Jul 28, 2014 at 11:21 AM, R J <[email protected]> wrote:
> Thanks a lot! I could compile with the added classpath:
> $javac -cp $($HADOOP_HOME/bin/hadoop classpath) PutMerge.java
> The above created the PutMerge.class file.
> Now I try to run:
> $java PutMerge
> Exception in thread "main" java.lang.NoClassDefFoundError: PutMerge
> Caused by: java.lang.ClassNotFoundException: PutMerge
>         at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
>         at java.security.AccessController.doPrivileged(Native Method)
>         at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
>         at java.lang.ClassLoader.loadClass(ClassLoader.java:307)
>         at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
>         at java.lang.ClassLoader.loadClass(ClassLoader.java:248)
> Could not find the main class: PutMerge. Program will exit.
>
> I get the above error.
> I tried:
> $set CLASSPATH=/usr/lib/hadoop/bin/hadoop
> $java PutMerge
>
> I still get the error.
>
>
> On Sunday, July 27, 2014 10:16 PM, Harsh J <[email protected]> wrote:
>
> The javac program can only find import dependencies referenced in a
> program if they are also supplied on the javac classpath. Setting
> HADOOP_HOME alone will not magically do this. Have you set an
> appropriate classpath?
>
> Try as below, perhaps:
>
> javac -cp $($HADOOP_HOME/bin/hadoop classpath) PutMerge.java
>
> Alternatively, consider using a modern build helper tool such as
> Apache Maven for writing Java applications; they make your work
> easier.
>
> On Mon, Jul 28, 2014 at 6:16 AM, R J <[email protected]> wrote:
>> Hi All,
>>
>> I am new to programming on Hadoop. I tried to compile the following
>> program (an example program from a Hadoop book) on my Linux server
>> where I have Hadoop installed. I get the errors:
>> $javac PutMerge.java
>> PutMerge.java:2: package org.apache.hadoop.conf does not exist
>> import org.apache.hadoop.conf.Configuration;
>>                               ^
>> PutMerge.java:3: package org.apache.hadoop.fs does not exist
>> import org.apache.hadoop.fs.FSDataInputStream;
>>                             ^
>> PutMerge.java:4: package org.apache.hadoop.fs does not exist
>> import org.apache.hadoop.fs.FSDataOutputStream;
>>                             ^
>> PutMerge.java:5: package org.apache.hadoop.fs does not exist
>> import org.apache.hadoop.fs.FileStatus;
>>                             ^
>> PutMerge.java:6: package org.apache.hadoop.fs does not exist
>> import org.apache.hadoop.fs.FileSystem;
>>                             ^
>> PutMerge.java:7: package org.apache.hadoop.fs does not exist
>> import org.apache.hadoop.fs.Path;
>>                             ^
>>
>> I have $HADOOP_HOME set up:
>> $echo $HADOOP_HOME
>> /usr/lib/hadoop
>>
>> Could you please suggest how to compile this program? Thanks a lot.
>>
>> Shu
>>
>>
>> ====PutMerge.java=========
>> import java.io.IOException;
>> import org.apache.hadoop.conf.Configuration;
>> import org.apache.hadoop.fs.FSDataInputStream;
>> import org.apache.hadoop.fs.FSDataOutputStream;
>> import org.apache.hadoop.fs.FileStatus;
>> import org.apache.hadoop.fs.FileSystem;
>> import org.apache.hadoop.fs.Path;
>>
>> public class PutMerge {
>>
>>     public static void main(String[] args) throws IOException {
>>         Configuration conf = new Configuration();
>>         FileSystem hdfs = FileSystem.get(conf);
>>         FileSystem local = FileSystem.getLocal(conf);
>>
>>         Path inputDir = new Path(args[0]);
>>         Path hdfsFile = new Path(args[1]);
>>
>>         try {
>>             FileStatus[] inputFiles = local.listStatus(inputDir);
>>             FSDataOutputStream out = hdfs.create(hdfsFile);
>>
>>             for (int i = 0; i < inputFiles.length; i++) {
>>                 System.out.println(inputFiles[i].getPath().getName());
>>                 FSDataInputStream in = local.open(inputFiles[i].getPath());
>>                 byte buffer[] = new byte[256];
>>                 int bytesRead = 0;
>>                 while ((bytesRead = in.read(buffer)) > 0) {
>>                     out.write(buffer, 0, bytesRead);
>>                 }
>>                 in.close();
>>             }
>>             out.close();
>>         } catch (IOException e) {
>>             e.printStackTrace();
>>         }
>>     }
>> }
>> =============
>>
>
>
> --
> Harsh J


--
Harsh J
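
Putting the two replies together, this is a minimal sketch of the full compile-and-run sequence discussed in the thread. It assumes PutMerge.java sits in the current working directory, that $HADOOP_HOME points at the Hadoop installation, and it uses placeholder arguments for the local input directory and the HDFS target file:

  # Compile against the jars listed by 'hadoop classpath'
  javac -cp $($HADOOP_HOME/bin/hadoop classpath) PutMerge.java

  # Run with the same classpath plus '.' so the JVM can find PutMerge.class;
  # /tmp/input-files and /user/rj/merged.txt are placeholder paths
  java -cp $($HADOOP_HOME/bin/hadoop classpath):. PutMerge /tmp/input-files /user/rj/merged.txt

  # Equivalently, export the classpath once (both javac and java honor the
  # CLASSPATH environment variable when -cp is not given)
  export CLASSPATH=$($HADOOP_HOME/bin/hadoop classpath):.
  javac PutMerge.java
  java PutMerge /tmp/input-files /user/rj/merged.txt

Note that the earlier attempt '$set CLASSPATH=/usr/lib/hadoop/bin/hadoop' fails on two counts: in bash, 'set' does not assign an environment variable (use 'export'), and the value must be the jar classpath emitted by 'hadoop classpath', not the path to the hadoop launcher script.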
