Thanks for the hint. It was actually hadoop.tmp.dir whose permissions needed to be relaxed.
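
In case anyone else hits this: the directory that hadoop.tmp.dir points to just needed to be writable by the user running bin/hive. Roughly what the fix looks like (the /tmp/hadoop-&lt;user&gt; path below is only the usual default, so substitute whatever your own config says):

    # blunt fix: open up the Hadoop tmp dir so the hive user can write to it
    # (chown-ing it to the right user would be the tidier option)
    chmod -R 777 /tmp/hadoop-vinay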

It would have been nice if the error message were more helpful (a fuller stack trace, or the name of the file/directory that caused the exception). Perhaps I will find out once I start digging deeper into the code.
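
On the diagnostics front, I haven't dug in yet, but it looks like CLI logging can be turned up through log4j, something along these lines (not sure which releases support passing it on the command line):

    # assumes the stock hive-log4j.properties setup; sends DEBUG output to the console
    bin/hive -hiveconf hive.root.logger=DEBUG,console

No idea yet whether that would have surfaced the offending path in this particular case.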

-Vinay




________________________________
From: Zheng Shao <[email protected]>
To: [email protected]
Sent: Thursday, August 20, 2009 10:51:53 PM
Subject: Re: File create perm. denied on starting hive

Can you check the "mapred.tmp.dir" in your hadoop-site.xml/hadoop-default.xml?
It seems that Hadoop is unable to write to that directory.
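
A quick way to see which paths those settings resolve to is something like this (assuming the standard conf/ layout; hadoop-site.xml overrides hadoop-default.xml):

    grep -A 2 "tmp.dir" $HADOOP_HOME/conf/hadoop-site.xml $HADOOP_HOME/conf/hadoop-default.xml

and then check that the user running Hive can actually write to whatever comes back.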

Zheng

On Thu, Aug 20, 2009 at 12:30 PM, vinay gupta<[email protected]> wrote:
> Hello hive users,
> I am getting the following exception while running bin/hive
> java.io.IOException: Permission denied
> at java.io.UnixFileSystem.createFileExclusively(Native Method)
> at java.io.File.checkAndCreate(File.java:1704)
> at java.io.File.createTempFile(File.java:1793)
> at org.apache.hadoop.util.RunJar.main(RunJar.java:116)
> at org.apache.hadoop.mapred.JobShell.run(JobShell.java:54)
> at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
> at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:79)
> at org.apache.hadoop.mapred.JobShell.main(JobShell.java:68)
>
>
> Has anyone else seen this?
>
> I built Hive against Hadoop 0.19.1 and am running with Java 6.
> Also, does anyone know how I can get more info on the stack trace?
> Is there a Hive debug mode that prints more diagnostics?
> Please let me know. I just installed Hive and am trying to set it up.
> Thanks
> Vinay
>



-- 
Yours,
Zheng



      
