Thank you.
I tried all of the following, but none of them works:
FSDataOutputStream out = hdfs.create(new Path("/user/logger/dev2/"));
FSDataOutputStream out = hdfs.create(new Path("/user/logger/dev2"));
Path hdfsFile = new Path("/user/logger/dev2/one.dat");
FSDataOutputStream out = hdfs.create(hdfsFile);
Path hdfsFile = new Path("/user/logger/dev2");
FSDataOutputStream out = hdfs.create(hdfsFile);
Path hdfsFile = new Path("/user/logger/dev2/");
FSDataOutputStream out = hdfs.create(hdfsFile);
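One thing I notice in the stack trace below: the failure comes from
org.apache.hadoop.fs.ChecksumFileSystem, which is the base class of the
local filesystem, so the FileSystem handle may be resolving to the local
disk instead of HDFS if the cluster's core-site.xml is not on the
classpath. A minimal check, with a placeholder namenode address (not a
value from this thread):

import java.net.URI;

Configuration conf = new Configuration();
// Placeholder namenode URI; substitute the cluster's real
// fs.default.name / fs.defaultFS value from core-site.xml.
FileSystem hdfs = FileSystem.get(
        URI.create("hdfs://namenode.example.com:8020"), conf);
// Prints DistributedFileSystem when actually talking to HDFS;
// LocalFileSystem means the default config resolved to the local disk.
System.out.println(hdfs.getClass().getName());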
On Tuesday, July 29, 2014 1:57 AM, Wellington Chevreuil
<[email protected]> wrote:
Hmm, I'm not sure, but I think that through the API you have to create each
folder level one at a time. For instance, if your current path is
"/user/logger" and you want to create "/user/logger/dev2/tmp2", you first
have to do hdfs.create(new Path("/user/logger/dev2")) and then
hdfs.create(new Path("/user/logger/dev2/tmp2")). Have you tried that
already?
On 29 Jul 2014, at 08:43, R J <[email protected]> wrote:
>Hi All,
>
>I am trying to programmatically create a directory in HDFS, but it fails
>with an error.
>
>This is the part of my code:
>Path hdfsFile = new Path("/user/logger/dev2/tmp2");
>try {
>    FSDataOutputStream out = hdfs.create(hdfsFile);
>}
>
>And I get this error:
>java.io.IOException: Mkdirs failed to create /user/logger/dev2/tmp2
>    at org.apache.hadoop.fs.ChecksumFileSystem.create(ChecksumFileSystem.java:379)
>    at org.apache.hadoop.fs.ChecksumFileSystem.create(ChecksumFileSystem.java:365)
>    at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:584)
>    at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:565)
>    at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:472)
>    at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:464)
>    at PutMerge.main(PutMerge.java:20)
>
>I can create (and then remove) the same HDFS directory via the hadoop
>command, as the same user who runs the java executable:
>$ hadoop fs -mkdir /user/logger/dev/tmp2
>$ hadoop fs -rmr /user/logger/dev/tmp2
>(the above works)
>
>Here is my entire code:
>------PutMerge.java------
>import java.io.IOException;
>import org.apache.hadoop.conf.Configuration;
>import org.apache.hadoop.fs.FSDataInputStream;
>import org.apache.hadoop.fs.FSDataOutputStream;
>import org.apache.hadoop.fs.FileStatus;
>import org.apache.hadoop.fs.FileSystem;
>import org.apache.hadoop.fs.Path;
>public class PutMerge {
>
>    public static void main(String[] args) throws IOException {
>        Configuration conf = new Configuration();
>        FileSystem hdfs = FileSystem.get(conf);
>        FileSystem local = FileSystem.getLocal(conf);
>
>        Path inputDir = new Path("/home/tmp/test");
>        Path hdfsFile = new Path("/user/logger/dev/tmp2");
>
>        try {
>            FileStatus[] inputFiles = local.listStatus(inputDir);
>            FSDataOutputStream out = hdfs.create(hdfsFile);
>
>            for (int i = 0; i < inputFiles.length; i++) {
>                System.out.println(inputFiles[i].getPath().getName());
>                FSDataInputStream in = local.open(inputFiles[i].getPath());
>                byte[] buffer = new byte[256];
>                int bytesRead = 0;
>                while ((bytesRead = in.read(buffer)) > 0) {
>                    out.write(buffer, 0, bytesRead);
>                }
>                in.close();
>            }
>            out.close();
>        } catch (IOException e) {
>            e.printStackTrace();
>        }
>    }
>}
>------