Not exactly: you will still get the issue on version 0.20.204. You want version 0.20.2.
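
If you are pulling Mahout in through Maven, one way to force the older Hadoop is to exclude the transitive hadoop-core and pin 0.20.2 yourself. This is only a sketch; the exact artifact coordinates are an assumption, so check what your build actually resolves with `mvn dependency:tree` first:

```xml
<!-- pom.xml sketch (coordinates assumed): drop the hadoop-core that
     mahout-core pulls in transitively, then declare 0.20.2 directly -->
<dependency>
  <groupId>org.apache.mahout</groupId>
  <artifactId>mahout-core</artifactId>
  <version>0.7</version>
  <exclusions>
    <exclusion>
      <groupId>org.apache.hadoop</groupId>
      <artifactId>hadoop-core</artifactId>
    </exclusion>
  </exclusions>
</dependency>
<dependency>
  <groupId>org.apache.hadoop</groupId>
  <artifactId>hadoop-core</artifactId>
  <version>0.20.2</version>
</dependency>
```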

On 7/31/2012 10:19 AM, Videnova, Svetlana wrote:
Thank you for your answers.

Does that mean I have to download Hadoop (0.20.204), and then what?
For the moment I am using only the Mahout libs.

-----Original message-----
From: Julian Ortega [mailto:[email protected]]
Sent: Tuesday, 31 July 2012 09:47
To: [email protected]
Subject: Re: Re: mahout lib : permissions

As the previous email says, you must be using Windows. As far as I remember, 
you first need Cygwin, and then to solve the permissions issue you need to 
downgrade to Hadoop 0.20.2. I think Mahout 0.7 depends on Hadoop 0.20.204, so 
you will get the permissions error on Windows, and you will keep getting it 
from that version onwards (not sure whether 2.0.0-alpha falls into this 
group). There is a workaround for it, but it is quite convoluted.

On 7/31/2012 9:36 AM, alias wrote:
It seems you are running your code on local Windows instead of a Hadoop 
cluster? Please copy your code and try to run it on a Hadoop cluster.

------------------ Original message ------------------
From: "Videnova, Svetlana";
Sent: Tuesday, 31 July 2012, 3:27 PM
To: "[email protected]";
Subject: mahout lib : permissions



Hi mahouters,

I am trying to use the Mahout lib in my Java app.

But when I try to cluster documents by calling this:

DocumentProcessor.tokenizeDocuments(new Path(inputDir),
    analyzer.getClass().asSubclass(Analyzer.class), tokenizedPath, conf);

And this:

InputDriver.runJob(new Path(inputDir), tokenizedPath,
    "org.apache.mahout.math.RandomAccessSparseVector");


I get this error:

Exception in thread "main" java.io.IOException: Failed to set permissions of path: C:\Hadoop\mapred\staging\csi_team370130067\.staging to 0700
        at org.apache.hadoop.fs.FileUtil.checkReturnValue(FileUtil.java:680)
        at org.apache.hadoop.fs.FileUtil.setPermission(FileUtil.java:653)
        at org.apache.hadoop.fs.RawLocalFileSystem.setPermission(RawLocalFileSystem.java:483)
        at org.apache.hadoop.fs.RawLocalFileSystem.mkdirs(RawLocalFileSystem.java:318)
        at org.apache.hadoop.fs.FilterFileSystem.mkdirs(FilterFileSystem.java:183)
        at org.apache.hadoop.mapreduce.JobSubmissionFiles.getStagingDir(JobSubmissionFiles.java:116)
        at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:813)
        at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:807)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Unknown Source)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1059)
        at org.apache.hadoop.mapred.JobClient.submitJobInternal(JobClient.java:807)
        at org.apache.hadoop.mapreduce.Job.submit(Job.java:465)
        at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:495)
        at org.apache.mahout.vectorizer.DocumentProcessor.tokenizeDocuments(DocumentProcessor.java:93)
        at main.MainClass.main(MainClass.java:53)

Can somebody please help me solve this error?

Think green - keep it on the screen.

This e-mail and any attachment is for authorised use by the intended 
recipient(s) only. It may contain proprietary material, confidential 
information and/or be subject to legal privilege. It should not be copied, 
disclosed to, retained or used by, any other party. If you are not an intended 
recipient then please promptly delete this e-mail and any attachment and all 
copies and inform the sender. Thank you.



