I think paths of the form "file:///directory-path" (the file:// scheme, note
the three slashes) should give you local file access with this program.
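
For example, something along these lines (an untested sketch; the namenode
address and the two paths are just placeholders for your setup) should copy a
file from the local disk into HDFS:

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class LocalToHdfsCopy {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        // Assumes the namenode runs on localhost:9000; adjust as needed.
        conf.set("fs.default.name", "hdfs://localhost:9000");
        FileSystem hdfs = FileSystem.get(conf);

        // The file:// scheme forces the source Path to resolve against the
        // local file system even though fs.default.name points at HDFS.
        Path src = new Path("file:///home/ajey/input.txt"); // hypothetical
        Path dst = new Path("/user/ajey/input.txt");        // HDFS target

        // false = do not delete the local source after copying
        hdfs.copyFromLocalFile(false, src, dst);
    }
}

A plain source path with no scheme (e.g. "/home/ajey/input.txt") is resolved
against the default file system, which is why your code went looking for the
input on HDFS.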
On 5/6/08 3:34 PM, "Ajey Shah" <[EMAIL PROTECTED]> wrote:
>
> Thanks Suresh. But even this program reads from and writes to HDFS. What I
> need to do is read from my normal local Linux hard drive and write to
> HDFS.
>
> I'm sorry if I misunderstood your program.
>
> Thanks for replying. :)
>
>
>
> Babu, Suresh wrote:
>>
>>
>> Try this program. Modify the HDFS configuration if it differs from the
>> default.
>>
>> import java.io.IOException;
>>
>> import org.apache.hadoop.conf.Configuration;
>> import org.apache.hadoop.fs.FileStatus;
>> import org.apache.hadoop.fs.FileSystem;
>> import org.apache.hadoop.fs.FSDataInputStream;
>> import org.apache.hadoop.fs.FSDataOutputStream;
>> import org.apache.hadoop.fs.Path;
>>
>> public class HadoopDFSFileReadWrite {
>>
>>     static void usage() {
>>         System.out.println(
>>             "Usage : HadoopDFSFileReadWrite <input file> <output file>");
>>         System.exit(1);
>>     }
>>
>>     static void printAndExit(String str) {
>>         System.err.println(str);
>>         System.exit(1);
>>     }
>>
>>     public static void main(String[] argv) throws IOException {
>>         Configuration conf = new Configuration();
>>         // Point the client at the namenode; change if yours differs.
>>         conf.set("fs.default.name", "hdfs://localhost:9000");
>>         FileSystem fs = FileSystem.get(conf);
>>
>>         // List the home directory as a sanity check of the connection.
>>         FileStatus[] fileStatus = fs.listStatus(fs.getHomeDirectory());
>>         for (FileStatus status : fileStatus) {
>>             System.out.println("File: " + status.getPath());
>>         }
>>
>>         if (argv.length != 2)
>>             usage();
>>
>>         // HDFS deals with Path objects, not java.io.File
>>         Path inFile = new Path(argv[0]);
>>         Path outFile = new Path(argv[1]);
>>
>>         // Check if input/output are valid
>>         if (!fs.exists(inFile))
>>             printAndExit("Input file not found");
>>         if (!fs.isFile(inFile))
>>             printAndExit("Input should be a file");
>>         if (fs.exists(outFile))
>>             printAndExit("Output already exists");
>>
>>         // Copy the input file to the new output file, 256 bytes at a time
>>         FSDataInputStream in = fs.open(inFile);
>>         FSDataOutputStream out = fs.create(outFile);
>>         byte[] buffer = new byte[256];
>>         try {
>>             int bytesRead;
>>             while ((bytesRead = in.read(buffer)) > 0) {
>>                 out.write(buffer, 0, bytesRead);
>>             }
>>         } catch (IOException e) {
>>             System.out.println("Error while copying file");
>>         } finally {
>>             in.close();
>>             out.close();
>>         }
>>     }
>> }
>>
>> Suresh
>>
>>
>> -----Original Message-----
>> From: Ajey Shah [mailto:[EMAIL PROTECTED]
>> Sent: Thursday, May 01, 2008 3:31 AM
>> To: [email protected]
>> Subject: How do I copy files from my linux file system to HDFS using a
>> java prog?
>>
>>
>> Hello all,
>>
>> I need to copy files from my Linux file system to HDFS from a Java
>> program, not manually. This is the piece of code that I have:
>>
>> try {
>>     FileSystem hdfs = FileSystem.get(new Configuration());
>>
>>     LocalFileSystem ls = null;
>>     ls = hdfs.getLocal(hdfs.getConf());
>>
>>     hdfs.copyFromLocalFile(false, new Path(fileName),
>>             new Path(outputFile));
>> } catch (Exception e) {
>>     e.printStackTrace();
>> }
>>
>> The problem is that it searches for the input path on HDFS and not on my
>> Linux file system.
>>
>> Can someone point out where I may be wrong? I feel it's a configuration
>> issue but have not been able to figure it out.
>>
>> Thanks.
>> --
>> View this message in context:
>> http://www.nabble.com/How-do-I-copy-files-from-my-linux-file-system-to-HDFS-using-a-java-prog--tp16992491p16992491.html
>> Sent from the Hadoop core-user mailing list archive at Nabble.com.