This error (specific to Windows) is caused by an optimization introduced around 0.20.203, and it is still there in 1.0.0 :(. I don't know how to fix it other than to recompile Hadoop common with the optimization removed from RawLocalFileSystem.java:

/**
 * Use the command chmod to set permission.
 */
@Override
public void setPermission(Path p, FsPermission permission)
    throws IOException {
  execSetPermission(pathToFile(p), permission);
}
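
For illustration only, here is a rough sketch of what a chmod-based execSetPermission could look like (this is my own sketch, not the actual Hadoop source; it assumes a POSIX-style chmod, e.g. from cygwin, is on the PATH, and the usual imports: java.io.File, java.io.IOException, org.apache.hadoop.fs.permission.FsPermission):

// Illustrative sketch only -- not the real Hadoop implementation.
private void execSetPermission(File f, FsPermission permission)
    throws IOException {
  // Render the permission as an octal string, e.g. 0700 -> "0700".
  String octal = String.format("%04o", permission.toShort());
  // Run "chmod <octal> <file>", merging stderr into stdout.
  Process proc = new ProcessBuilder("chmod", octal, f.getAbsolutePath())
      .redirectErrorStream(true)
      .start();
  try {
    if (proc.waitFor() != 0) {
      throw new IOException("chmod " + octal + " failed for " + f);
    }
  } catch (InterruptedException e) {
    Thread.currentThread().interrupt();
    throw new IOException("interrupted while running chmod on " + f);
  }
}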

Vlad


-----Original Message----- From: shlomi java
Sent: Wednesday, January 11, 2012 6:46 AM
To: [email protected]
Subject: Re: Failed to set permissions of path

(sending this email again, because it seems it did not reach the forum)

On Wed, Jan 11, 2012 at 12:09 PM, shlomi java <[email protected]> wrote:

hi Hadoops & Nutchs,

I'm trying to run Nutch 1.4 *locally*, on Windows 7, using Hadoop
0.20.203.0.
I run with:
fs.default.name = D:\fs
hadoop.tmp.dir = D:\tmp
dfs.permissions = false
PATH environment variable contains C:\cygwin\bin.
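
(For reference, the same settings expressed programmatically -- just a sketch of my setup, with the property values exactly as listed above:)

// Sketch only: the settings above, set on a Hadoop Configuration.
import org.apache.hadoop.conf.Configuration;

public class LocalConf {
  public static Configuration create() {
    Configuration conf = new Configuration();
    conf.set("fs.default.name", "D:\\fs");      // local root, as above
    conf.set("hadoop.tmp.dir", "D:\\tmp");      // temp dir on D:
    conf.setBoolean("dfs.permissions", false);  // permission checks off
    return conf;
  }
}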

I get the following exception:

Exception in thread "main" java.io.IOException: Failed to set permissions of path: file:/D:/tmp/mapred/staging/username-835169260/.staging to 0700
    at org.apache.hadoop.fs.RawLocalFileSystem.checkReturnValue(RawLocalFileSystem.java:525)
    at org.apache.hadoop.fs.RawLocalFileSystem.setPermission(RawLocalFileSystem.java:499)
    at org.apache.hadoop.fs.RawLocalFileSystem.mkdirs(RawLocalFileSystem.java:318)
    at org.apache.hadoop.fs.FilterFileSystem.mkdirs(FilterFileSystem.java:183)
    at org.apache.hadoop.mapreduce.JobSubmissionFiles.getStagingDir(JobSubmissionFiles.java:116)
    at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:797)
    at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:791)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Unknown Source)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1059)
    at org.apache.hadoop.mapred.JobClient.submitJobInternal(JobClient.java:791)
    at org.apache.hadoop.mapred.JobClient.submitJob(JobClient.java:765)
    at org.apache.hadoop.mapred.JobClient.runJob(JobClient.java:1200)
    at org.apache.nutch.crawl.Injector.inject(Injector.java:217)
    at org.apache.nutch.crawl.Crawl.run(Crawl.java:127)
    at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
    at org.apache.nutch.crawl.Crawl.main(Crawl.java:55)

The call to rv = f.setReadable(group.implies(FsAction.READ), false); in RawLocalFileSystem.setPermission (f is a java.io.File) returns false, and that is what causes checkReturnValue to throw the exception.
The .staging folder above DOES get created; it is only setting the permissions that fails.
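
A tiny standalone check (my own sketch, nothing Hadoop-specific) shows the same File API behaviour: with 0700 the group bits carry no READ, so Hadoop ends up asking to remove read access for everybody, which the plain java.io.File API cannot express on Windows/NTFS, so on Windows this is expected to print false:

import java.io.File;
import java.io.IOException;

// Sketch of a standalone check (not Hadoop code): setReadable(false, false)
// means "remove read access for everybody". The pre-NIO java.io.File API
// cannot do that on Windows/NTFS, so it reports failure by returning false,
// which is exactly what checkReturnValue then turns into the IOException.
public class SetReadableCheck {
  public static void main(String[] args) throws IOException {
    File f = File.createTempFile("perm-check", ".tmp");
    try {
      boolean rv = f.setReadable(false, false);
      // Expected to print false on Windows with the plain File API.
      System.out.println("setReadable(false, false) returned " + rv);
    } finally {
      f.delete();
    }
  }
}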

I also tried Hadoop's hadoop.job.ugi property, giving it different values,
with no success.

I'm posting to both forums, because I don't know where the problem is.

Do you? :-)

10X
ShlomiJ

