The problem was fixed after running chmod 777 on the /user directory. Thanks.
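For reference, a minimal sketch of that fix as commands (the superuser name "hadoop" is an assumption taken from the trace below); a per-user home directory is usually a narrower fix than a world-writable /user:

```shell
# Run as the HDFS superuser (assumed here to be 'hadoop'):
#   hadoop fs -chmod 777 /user                # the blunt fix used above
#   hadoop fs -mkdir /user/hadoop2            # narrower alternative:
#   hadoop fs -chown hadoop2 /user/hadoop2    # give hadoop2 its own home dir
# Local demonstration of the same octal-mode semantics:
dir=$(mktemp -d)
chmod 777 "$dir"
mode=$(stat -c '%a' "$dir")   # GNU stat; prints the octal permission bits
echo "$mode"
rm -rf "$dir"
```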

Best Regards,

J.P
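P.S. For anyone hitting the earlier local /tmp error from RunJar (quoted below), a quick sketch to check whether the submitting user can actually create temp files where java.io.tmpdir points (assuming the default, /tmp):

```shell
# RunJar.main calls File.createTempFile under java.io.tmpdir (default /tmp),
# which is what threw "Permission denied" below; verify it is writable:
tmpfile=$(mktemp /tmp/runjar-check.XXXXXX)
echo "created $tmpfile"
rm -f "$tmpfile"
```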

On Jun 25, 2011, at 12:26 PM, 俊平堵 <[email protected]> wrote:

> Hello Harsh,
>          Thanks for your suggestion. I changed /tmp to 777, and now a
> different error occurs:
>
>  11/06/26 11:13:42 INFO mapred.FileInputFormat: creating control file: 50
> mega bytes, 2 files
> org.apache.hadoop.security.AccessControlException:
> org.apache.hadoop.security.AccessControlException: Permission denied:
> user=hadoop2, access=WRITE, inode="TestDFSIO":hadoop:supergroup:rwxr-xr-x
>         at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native
> Method)
>         at
> sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:39)
>         at
> sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
>         at java.lang.reflect.Constructor.newInstance(Constructor.java:513)
>         at
> org.apache.hadoop.ipc.RemoteException.instantiateException(RemoteException.java:96)
>         at
> org.apache.hadoop.ipc.RemoteException.unwrapRemoteException(RemoteException.java:58)
>         at org.apache.hadoop.hdfs.DFSClient.delete(DFSClient.java:584)
>         at
> org.apache.hadoop.hdfs.DistributedFileSystem.delete(DistributedFileSystem.java:227)
>         at
> org.apache.hadoop.fs.TestDFSIO.createControlFile(TestDFSIO.java:114)
>         at org.apache.hadoop.fs.TestDFSIO.main(TestDFSIO.java:351)
>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>         at
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
>         at
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>         at java.lang.reflect.Method.invoke(Method.java:597)
>         at
> org.apache.hadoop.util.ProgramDriver$ProgramDescription.invoke(ProgramDriver.java:68)
>         at
> org.apache.hadoop.util.ProgramDriver.driver(ProgramDriver.java:139)
>         at org.apache.hadoop.test.AllTestDriver.main(AllTestDriver.java:81)
>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>         at
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
>         at
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>         at java.lang.reflect.Method.invoke(Method.java:597)
>         at org.apache.hadoop.util.RunJar.main(RunJar.java:156)
> Caused by: org.apache.hadoop.ipc.RemoteException:
> org.apache.hadoop.security.AccessControlException: Permission denied:
> user=hadoop2, access=WRITE, inode="TestDFSIO":hadoop:supergroup:rwxr-xr-x
>         at
> org.apache.hadoop.hdfs.server.namenode.PermissionChecker.check(PermissionChecker.java:176)
>         at
> org.apache.hadoop.hdfs.server.namenode.PermissionChecker.check(PermissionChecker.java:157)
>         at
> org.apache.hadoop.hdfs.server.namenode.PermissionChecker.checkPermission(PermissionChecker.java:108)
>         at
> org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkPermission(FSNamesystem.java:4514)
>         at
> org.apache.hadoop.hdfs.server.namenode.FSNamesystem.deleteInternal(FSNamesystem.java:1702)
>         at
> org.apache.hadoop.hdfs.server.namenode.FSNamesystem.delete(FSNamesystem.java:1680)
>         at
> org.apache.hadoop.hdfs.server.namenode.NameNode.delete(NameNode.java:517)
>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>         at
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
>         at
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>         at java.lang.reflect.Method.invoke(Method.java:597)
>         at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:508)
>         at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:959)
>         at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:955)
>         at java.security.AccessController.doPrivileged(Native Method)
>         at javax.security.auth.Subject.doAs(Subject.java:396)
>         at org.apache.hadoop.ipc.Server$Handler.run(Server.java:953)
>
>         at org.apache.hadoop.ipc.Client.call(Client.java:740)
>         at org.apache.hadoop.ipc.RPC$Invoker.invoke(RPC.java:220)
>         at $Proxy0.delete(Unknown Source)
>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>         at
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
>         at
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>         at java.lang.reflect.Method.invoke(Method.java:597)
>         at
> org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:82)
>         at
> org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:59)
>         at $Proxy0.delete(Unknown Source)
>         at org.apache.hadoop.hdfs.DFSClient.delete(DFSClient.java:582)
>         ... 15 more
>
> It looks like an HDFS permissions issue; what should I do to resolve it?
>
> Thanks,
>
> J.P
>
> On Jun 24, 2011, at 2:04 PM, Harsh Chouraria <[email protected]> wrote:
>
>> Your /tmp is probably not writable by your current user (ensure /tmp is
>> 777 perhaps). RunJar, which prepares a complete jar to submit to Hadoop,
>> will require use of /tmp.
>>
>> P.S.: Please reply to, and keep, [email protected] in CC, so
>> others can follow the conversation and reply to you in case they have
>> suggestions!
>>
>>
>>  On 24-Jun-2011, at 11:19 AM, 俊平堵 wrote:
>>
>> Thanks for the reply. Yes, I cannot submit a job as another user; I tried
>> a test job as follows:
>> bin/hadoop jar hadoop-0.20.2-test.jar TestDFSIO -write -nrFiles 2
>> -fileSize 100
>> The error I got is:
>> Exception in thread "main" java.io.IOException: Permission denied
>>          at java.io.UnixFileSystem.createFileExclusively(Native Method)
>>         at java.io.File.checkAndCreate(File.java:1704)
>>         at java.io.File.createTempFile(File.java:1792)
>>         at org.apache.hadoop.util.RunJar.main(RunJar.java:115)
>>
>> My hadoop distribution is 0.20.2.
>>
>> Thanks,
>>
>> Junping
>>
>> 2011/6/24 Harsh J <[email protected]>
>>
>>> (-general@, +common-user@ -- Please use general@ only for project-wide
>>> discussions)
>>>
>>> Users do not need visibility of the Java processes in order to submit jobs.
>>>
>>> Specifically, are you facing any issues trying to run a job as another
>>> user?
>>>
>>> On Fri, Jun 24, 2011 at 5:45 AM, 俊平堵 <[email protected]> wrote:
>>> > Hello all,
>>> >         I set up a Hadoop cluster (the 0.20.2 distribution) and started
>>> > it as one user on the master node. When I switch to another user, I
>>> > cannot even see the Hadoop processes (using the jps command), which
>>> > means I cannot submit jobs to this cluster under another user account.
>>> > I have seen many articles saying that Hadoop supports multi-tenancy;
>>> > can anyone tell me how to configure it? Thanks a lot!
>>> >
>>> > Best Regards,
>>> >
>>> > J.P
>>> >
>>>
>>>
>>>
>>> --
>>> Harsh J
>>>
>>
>>
>>
>
