hadoop-hdfs contains the HDFS (Hadoop Distributed File System) classes.
Your code points to an hdfs:// path, so you are using HDFS, and those
classes are needed.

Add the hdfs jar to your classpath and see if that improves things.

If you organize imports on your class in an IDE such as Eclipse, you'll see 
that when you restrict the org.apache.hadoop.* import to only the packages 
you need, you are indeed using HDFS classes.
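
For example (a sketch only, assuming the code quoted below and that the jars 
sit under /Users/arko/Documents/hadoop-0.21.0/ as in your javac command), the 
explicit imports would be:

        import org.apache.hadoop.conf.Configuration;
        import org.apache.hadoop.fs.FileSystem;
        import org.apache.hadoop.fs.Path;

Note that "import org.apache.hadoop.*;" only covers classes directly in that 
package, not subpackages, which is also why javac falls back to the 
package-private java.io.FileSystem pulled in by "import java.io.*;". Compiling 
would then look something like:

        javac -classpath /Users/arko/Documents/hadoop-0.21.0/hadoop-common-0.21.0.jar:/Users/arko/Documents/hadoop-0.21.0/hadoop-hdfs-0.21.0.jar:/Users/arko/Documents/hadoop-0.21.0/hadoop-mapred-0.21.0.jar -d class/ FileTest.java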

Thanks,

Joep

From: Arko Provo Mukherjee [mailto:arkoprovomukher...@gmail.com]
Sent: Wednesday, August 31, 2011 5:40 PM
To: mapreduce-user@hadoop.apache.org
Subject: Re: Compiling programs with Hadoop 0.21.0

Hi,

Thanks for the kind reply.

If only the common and mapred are used, then what is hadoop-hdfs-0.21.0.jar 
used for?

My code looks like this:


import java.util.*;
import java.lang.*;
import java.io.*;
import org.apache.hadoop.*;

class MapReduce  {

    public static void main(String[] args) throws Exception  {
        try  {
                Path pt=new Path("hdfs://localhost:54310//Users/arko/Documents/Research/HDFS/abc");
                FileSystem fs = FileSystem.get(new Configuration());
                BufferedWriter br=new BufferedWriter(new OutputStreamWriter(fs.create(pt,true)));
                String line;
                line="Testing";
                System.out.println(line);
                br.write(line);
                br.close();
        }catch(Exception e){
                System.out.println("File not found");
        }
    }
}

When I compile the code, I get the following errors:
$javac -classpath /Users/arko/Documents/hadoop-0.21.0/hadoop-common-0.21.0.jar 
-d class/ FileSystemCat.java clear

FileTest.java:16: cannot find symbol
symbol  : class Path
location: class FileTest
                Path pt=new 
Path("hdfs://localhost:54310//Users/arko/Documents/Research/HDFS/abc");
                ^
FileTest.java:16: cannot find symbol
symbol  : class Path
location: class FileTest
                Path pt=new 
Path("hdfs://localhost:54310//Users/arko/Documents/Research/HDFS/abc");
                            ^
FileTest.java:17: java.io.FileSystem is not public in java.io; cannot be accessed from outside package
                FileSystem fs = FileSystem.get(new Configuration());
                ^
FileTest.java:17: cannot find symbol
symbol  : class Configuration
location: class FileTest
                FileSystem fs = FileSystem.get(new Configuration());
                                                   ^
FileTest.java:17: java.io.FileSystem is not public in java.io; cannot be accessed from outside package
                FileSystem fs = FileSystem.get(new Configuration());
                                ^
5 errors

Thanks again for help!
Warm Regards
Arko
On Wed, Aug 31, 2011 at 5:00 PM, Robert Evans <ev...@yahoo-inc.com> wrote:
You should be able to use hadoop-common-0.21.0.jar for accessing the HDFS APIs and 
hadoop-mapred-0.21.0.jar for accessing the mapreduce APIs.  I cannot really 
comment further on compilation errors without seeing the code/error messages.

--Bobby Evans


On 8/31/11 4:34 PM, "Arko Provo Mukherjee" <arkoprovomukher...@gmail.com> wrote:
Hello,

I am trying to learn Hadoop and doing a project on it.

I need to update some files in my project and hence wanted to use version 0.21.0.

However, I am confused as to how I can compile my programs on version 0.21.0, as 
it doesn't have any hadoop-core-0.21.0.jar file. What should I give in the 
-classpath option?
I can see three different JAR files namely hadoop-common-0.21.0.jar, 
hadoop-hdfs-0.21.0.jar & hadoop-mapred-0.21.0.jar.

I am getting compilation errors when using only the common jar file.

Also, I would like to learn the FileContext library, as the docs that I have read 
say it has a simpler interface than FileSystem. However, I cannot find any links 
to example code that would help me grasp the API. Request you to kindly share any 
link / code snippet demonstrating the use of the new API.

Many thanks in advance for your kind response.

Warm regards
Arko
