I don't know if this is useful... but you could try this small example:

import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.fs.FSDataOutputStream;
import org.apache.hadoop.fs.FSDataInputStream;
import org.apache.hadoop.conf.Configuration;

import java.io.IOException;

public class ReadWrite {
    public static void main(String[] args) throws IOException {
        // Uses the default file system from the configuration on the classpath
        FileSystem hdfs = FileSystem.get(new Configuration());

        Path path = new Path("/testfile");

        // writing
        FSDataOutputStream dos = hdfs.create(path);
        dos.writeUTF("Hello World");
        dos.close();

        // reading
        FSDataInputStream dis = hdfs.open(path);
        System.out.println(dis.readUTF());
        dis.close();

        hdfs.close();
    }
}



I'm assuming that your classpath contains a correctly configured hadoop-site.xml file, i.e. that the property "fs.default.name" points to a valid NameNode for your cluster. Alternatively, you can use FileSystem.getLocal(new Configuration()) to work with the local file system instead.
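For reference, a minimal hadoop-site.xml setting that property might look like the sketch below; the host name and port are placeholders, so substitute the actual address of your own NameNode:

```xml
<?xml version="1.0"?>
<configuration>
  <property>
    <name>fs.default.name</name>
    <!-- hypothetical NameNode host and port; replace with your cluster's -->
    <value>hdfs://namenode-host:9000</value>
  </property>
</configuration>
```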

John


Hi,
   Are there any examples of using HDFS from a Java program? My requirement
is simple: read and write on HDFS.
   Thank you.







    Regards
     HeQi
