Hi Bi,
If you want to run those as two separate commands, you need to add the export keyword:
export HADOOP_CLASSPATH=`${HBASE_HOME}/bin/hbase classpath`
hadoop jar xxx completebulkload xxx xxx
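For anyone else hitting this: the difference between the two forms can be sketched in plain shell. The `/tmp/fake-classpath` value below is just a stand-in for the real `hbase classpath` output, and `sh -c` stands in for the `hadoop` child process:

```shell
# Two separate lines do NOT work: without `export`, the assignment stays
# local to the current shell, and the child process never sees it.
unset HADOOP_CLASSPATH
HADOOP_CLASSPATH="/tmp/fake-classpath"          # stand-in for `hbase classpath` output
sh -c 'echo "child sees: [$HADOOP_CLASSPATH]"'  # prints: child sees: []

# Fix 1: export the variable, so child processes inherit it.
export HADOOP_CLASSPATH="/tmp/fake-classpath"
sh -c 'echo "child sees: [$HADOOP_CLASSPATH]"'  # prints: child sees: [/tmp/fake-classpath]

# Fix 2: put the assignment on the same line as the command; the shell
# then passes it in that one command's environment only.
unset HADOOP_CLASSPATH
HADOOP_CLASSPATH="/tmp/fake-classpath" sh -c 'echo "child sees: [$HADOOP_CLASSPATH]"'
```

That second form is what the book's one-liner uses, which is why splitting it onto two lines silently breaks it.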
JM
2014-06-10 22:02 GMT-04:00 Bi,hongyu—mike <[email protected]>:
> it works!
> thanks very much!
> I thought it was two commands, so I ran them separately:
> HADOOP_CLASSPATH=`${HBASE_HOME}/bin/hbase classpath`
> hadoop jar xxx completebulkload xxx xxx
>
> my bad.
>
> thanks Ted!
>
>
> 2014-06-11 9:59 GMT+08:00 Ted Yu <[email protected]>:
>
> > Please take a look at http://hbase.apache.org/book.html#completebulkload
> >
> > Notice the command which starts with:
> >
> > HADOOP_CLASSPATH=`${HBASE_HOME}/bin/hbase classpath`
> >
> >
> >
> > On Tue, Jun 10, 2014 at 6:41 PM, Bi,hongyu—mike <[email protected]>
> wrote:
> >
> > > Hi,
> > > I ran into trouble bulkloading HFiles into HBase after I enabled HBase
> > > ACL in hbase-site.xml:
> > > <property>
> > >   <name>hbase.rpc.engine</name>
> > >   <value>org.apache.hadoop.hbase.ipc.SecureRpcEngine</value>
> > > </property>
> > > <property>
> > >   <name>hbase.coprocessor.master.classes</name>
> > >   <value>org.apache.hadoop.hbase.security.access.AccessController</value>
> > > </property>
> > > <property>
> > >   <name>hbase.coprocessor.region.classes</name>
> > >   <value>org.apache.hadoop.hbase.security.token.TokenProvider,org.apache.hadoop.hbase.security.access.AccessController,org.apache.hadoop.hbase.security.access.SecureBulkLoadEndpoint</value>
> > > </property>
> > >
> > > I run the command as hdfs (the hadoop/hbase superuser) on the HMaster server:
> > > hadoop jar hbase-0.94.15-cdh4.6.0-security.jar completebulkload
> > > hdfs://nameservicexxx:8020/hfiles_2hbase/ hbasetablename
> > >
> > > From the exception it seems MapReduce didn't pick up hbase.rpc.engine
> > > and still uses WritableRpcEngine. Should I add hbase.rpc.engine to
> > > mapred-site.xml and restart the jobtracker + tasktracker, or is there
> > > another solution? Thanks very much for your help :)
> > >
> > >
> > > 14/06/11 09:31:45 INFO client.HConnectionManager$HConnectionImplementation:
> > > getMaster attempt 2 of 14 failed; retrying after sleep of 1002
> > > java.io.IOException: Call to xxx/xxx:60000 failed on local exception:
> > > java.io.EOFException
> > >     at org.apache.hadoop.hbase.ipc.HBaseClient.wrapException(HBaseClient.java:1047)
> > >     at org.apache.hadoop.hbase.ipc.HBaseClient.call(HBaseClient.java:1016)
> > >     at org.apache.hadoop.hbase.ipc.WritableRpcEngine$Invoker.invoke(WritableRpcEngine.java:87)
> > >     at $Proxy7.getProtocolVersion(Unknown Source)
> > >     at org.apache.hadoop.hbase.ipc.WritableRpcEngine.getProxy(WritableRpcEngine.java:141)
> > >     at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.getMaster(HConnectionManager.java:813)
> > >     at org.apache.hadoop.hbase.client.HBaseAdmin.<init>(HBaseAdmin.java:127)
> > >     at org.apache.hadoop.hbase.mapreduce.LoadIncrementalHFiles.<init>(LoadIncrementalHFiles.java:114)
> > >     at org.apache.hadoop.hbase.mapreduce.LoadIncrementalHFiles.main(LoadIncrementalHFiles.java:792)
> > >     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> > >     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
> > >     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
> > >     at java.lang.reflect.Method.invoke(Method.java:597)
> > >     at org.apache.hadoop.util.ProgramDriver$ProgramDescription.invoke(ProgramDriver.java:72)
> > >     at org.apache.hadoop.util.ProgramDriver.driver(ProgramDriver.java:144)
> > >     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> > >     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
> > >     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
> > >     at java.lang.reflect.Method.invoke(Method.java:597)
> > >     at org.apache.hadoop.hbase.mapreduce.Driver.main(Driver.java:51)
> > >     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> > >     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
> > >     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
> > >     at java.lang.reflect.Method.invoke(Method.java:597)
> > >     at org.apache.hadoop.util.RunJar.main(RunJar.java:208)
> > > Caused by: java.io.EOFException
> > >     at java.io.DataInputStream.readInt(DataInputStream.java:375)
> > >     at org.apache.hadoop.hbase.ipc.HBaseClient$Connection.receiveResponse(HBaseClient.java:654)
> > >     at org.apache.hadoop.hbase.ipc.HBaseClient$Connection.run(HBaseClient.java:588)
> > >
> >
>