Jon,
Were you running CDH3 with security turned on?
LoadIncrementalHFiles.doBulkLoad() may split an HFile into pieces in the
same folder and then load them. In your case, the MR job created the HFiles
under a directory owned by user geoff, where the hbase user doesn't have
write permission. You could try putting the HFiles in a world-writable
(777) directory such as /tmp and trying again.
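To illustrate the permission fix locally (this is only a sketch of the permission bits involved; on the actual cluster the equivalent command would be `hadoop fs -chmod -R 777 <staging dir>`, with the staging directory taken from your job's output path):

```shell
# Local simulation of the suggested fix. On HDFS the equivalent is:
#   hadoop fs -chmod -R 777 /toplevel/ev/tmp/<staging-dir>
# The directory name below is illustrative only.
mkdir -p /tmp/hfile_staging_demo
chmod 777 /tmp/hfile_staging_demo
# Verify the mode: world-writable, so the hbase user can rename files out of it.
stat -c '%a' /tmp/hfile_staging_demo   # prints 777
```

The point is that bulk load ends in an HDFS rename performed by the region server (running as the hbase user), so that user needs write access on the directory holding the HFiles.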
Thanks,
Mingjie
On 06/29/2011 04:04 PM, Jon Stewart wrote:
Howdy,
I'm generating some HFiles during a MapReduce job using
KeyValueSortReducer and HFileOutputFormat. After the job completes, I
then call LoadIncrementalHFiles.doBulkLoad().
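For context, this is the same code path exercised by HBase's completebulkload command-line tool; a CDH3-era invocation looks roughly like the following (the jar path, HFile directory, and table name are illustrative, not taken from the job above):

```shell
# Load previously generated HFiles into an existing table.
# Requires a running cluster; paths and table name are examples only.
hadoop jar /usr/lib/hbase/hbase.jar completebulkload \
    hdfs://machine/toplevel/ev/tmp/hfiles mytable
```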
On the pseudo-distributed setup on my dev machine, this works quickly and
without error. On a small CDH3/EC2 cluster (CDH3 Hadoop & HBase
versions), it throws an AccessControlException. It seems obvious
to me that either the user running the job ("geoff", below) does not
have sufficient permissions to write to HBase, or the hbase user does
not have sufficient permissions to read the HFiles. However, I'm
clueless as to how to fix it. Help?
Thanks,
Jon
11/06/29 22:03:41 INFO mapreduce.LoadIncrementalHFiles: Trying to load hfile=hdfs://machine/toplevel/ev/tmp/1f96122d-215a-4e2b-970f-a7bafa71f718/0/6077274466047126248
first=\x00\x00M\x80N\xCF\xA1\xF6:\xD9\xFB\xF9\xB7l\xCAD\x98\x97R\xE1\x01\xD6\x96K\xD7\xE2\xD39\xBA\x8D\xCE\xB9RVI,\x98\xDD\xF2\xE8\xC1X\xAD\xC9\x88"`t\xA0\x8D\xFE\x80N\x00\x00\x00\x0E
last=\xFF\xFF\xD8\xB2\x9F\xA5dY\xA0\x86Cg\xAD6.t\x86,\xC7\x96\x01!\x96K\xD7\xE2\xD39\xBA\x8D\xCE\xB9RVI,\x98\xDD>\xC8~p\xA9\x85\x06\x1Fu\xA7\x11\xE2\x0D4H\x00\x00\x00\x89
Exception in thread "main" org.apache.hadoop.hbase.client.RetriesExhaustedException: Trying to contact region server domU-12-31-39-16-29-C8.compute-1.internal:60020 for region hash,,1309216307388.e904285780ef557555ed39b7b344aa9b., row '\x00\x00M\x80N\xCF\xA1\xF6:\xD9\xFB\xF9\xB7l\xCAD\x98\x97R\xE1\x01\xD6\x96K\xD7\xE2\xD39\xBA\x8D\xCE\xB9RVI,\x98\xDD\xF2\xE8\xC1X\xAD\xC9\x88"`t\xA0\x8D\xFE\x80N\x00\x00\x00\x0E', but failed after 100 attempts.
Exceptions:
org.apache.hadoop.security.AccessControlException: org.apache.hadoop.security.AccessControlException: org.apache.hadoop.security.AccessControlException: Permission denied: user=hbase, access=WRITE, inode="0":geoff:supergroup:rwxr-xr-x
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:39)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:513)
    at org.apache.hadoop.ipc.RemoteException.instantiateException(RemoteException.java:95)
    at org.apache.hadoop.ipc.RemoteException.unwrapRemoteException(RemoteException.java:57)
    at org.apache.hadoop.hdfs.DFSClient.rename(DFSClient.java:626)
    at org.apache.hadoop.hdfs.DistributedFileSystem.rename(DistributedFileSystem.java:237)
    at org.apache.hadoop.hbase.regionserver.StoreFile.rename(StoreFile.java:511)
    at org.apache.hadoop.hbase.regionserver.Store.bulkLoadHFile(Store.java:366)
    at org.apache.hadoop.hbase.regionserver.HRegion.bulkLoadHFile(HRegion.java:2196)
    at org.apache.hadoop.hbase.regionserver.HRegionServer.bulkLoadHFile(HRegionServer.java:2046)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    at java.lang.reflect.Method.invoke(Method.java:597)
    at org.apache.hadoop.hbase.ipc.HBaseRPC$Server.call(HBaseRPC.java:570)
    at org.apache.hadoop.hbase.ipc.HBaseServer$Handler.run(HBaseServer.java:1039)
Caused by: org.apache.hadoop.ipc.RemoteException: org.apache.hadoop.security.AccessControlException: Permission denied: user=hbase, access=WRITE, inode="0":geoff:supergroup:rwxr-xr-x
    at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:203)
    at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:184)
    at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:135)
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkPermission(FSNamesystem.java:4976)
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkParentAccess(FSNamesystem.java:4945)
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.renameToInternal(FSNamesystem.java:1865)
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.renameTo(FSNamesystem.java:1840)
    at org.apache.hadoop.hdfs.server.namenode.NameNode.rename(NameNode.java:725)
    at sun.reflect.GeneratedMethodAccessor16.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    at java.lang.reflect.Method.invoke(Method.java:597)
    at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:557)
    at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1415)
    at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1411)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:396)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1115)
    at org.apache.hadoop.ipc.Server$Handler.run(Server.java:1409)
    at org.apache.hadoop.ipc.Client.call(Client.java:1104)
    at org.apache.hadoop.ipc.RPC$Invoker.invoke(RPC.java:226)
    at $Proxy6.rename(Unknown Source)
    at sun.reflect.GeneratedMethodAccessor13.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    at java.lang.reflect.Method.invoke(Method.java:597)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:82)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:59)
    at $Proxy6.rename(Unknown Source)
    at org.apache.hadoop.hdfs.DFSClient.rename(DFSClient.java:624)
    ... 11 more