Thanks, Ted.

hbase-1.2.3 worked!
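
In case it is useful to anyone searching the archives later: the error below came
down to a mismatch between the HBase release Hive 2.0.1 is built against (1.1.1,
as Ted points out) and the 0.98 client jars on the runtime classpath. As a rough
sketch only (the paths and jar names here are illustrative, not copied from my
setup), one way to point Hive at the newer client jars is:

    # illustrative paths and versions, adjust to your own install
    export HBASE_HOME=/home/hduser/hbase-1.2.3
    hive --auxpath $HBASE_HOME/lib/hbase-client-1.2.3.jar,$HBASE_HOME/lib/hbase-common-1.2.3.jar,$HBASE_HOME/lib/hbase-protocol-1.2.3.jar,$HBASE_HOME/lib/hbase-server-1.2.3.jar

Setting hive.aux.jars.path (or the HIVE_AUX_JARS_PATH environment variable) should
achieve the same thing as --auxpath.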



Dr Mich Talebzadeh



LinkedIn: https://www.linkedin.com/profile/view?id=AAEAAAAWh2gBxianrbJd6zP6AcPCCdOABUrV8Pw



http://talebzadehmich.wordpress.com


*Disclaimer:* Use it at your own risk. Any and all responsibility for any
loss, damage or destruction of data or any other property which may arise
from relying on this email's technical content is explicitly disclaimed.
The author will in no case be liable for any monetary damages arising from
such loss, damage or destruction.



On 20 October 2016 at 17:09, Ted Yu <[email protected]> wrote:

> I downloaded the Hive 2.0.1 source tarball.
>
> In its pom.xml:
>
>     <hbase.version>1.1.1</hbase.version>
>
> Can you run against 1.1.1 or a newer HBase release?
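>
> If you build Hive from source, the effective value can be checked, and in
> principle overridden, with plain Maven. Roughly, and untested against this
> exact tarball:
>
>     mvn help:evaluate -Dexpression=hbase.version
>     mvn clean package -DskipTests -Dhbase.version=1.1.1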
>
> On Thu, Oct 20, 2016 at 8:58 AM, Mich Talebzadeh <[email protected]> wrote:
>
> > Hive 2.0.1
> > HBase 0.98
> >
> > hive> select max(price) from test.marketdatahbase;
> >
> > Throws:
> >
> > Caused by: java.lang.NoSuchMethodError:
> > org.apache.hadoop.hbase.protobuf.generated.ClientProtos$
> >
> >
> > I have both hbase-protocol-0.98.21-hadoop2.jar and protobuf-java-2.5.0.jar
> > in the $HBASE_HOME/lib directory.
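> >
> > As a rough check of which of these jars actually end up on the HBase client
> > classpath (illustrative, adjust the pattern as needed):
> >
> >     hbase classpath | tr ':' '\n' | grep -E 'hbase-protocol|protobuf-java'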
> >
> > The full error is below:
> >
> > Query ID = hduser_20161020164447_d283db5c-056d-4d40-8998-d2cca1e63f12
> > Total jobs = 1
> > Launching Job 1 out of 1
> > Number of reduce tasks determined at compile time: 1
> > In order to change the average load for a reducer (in bytes):
> >   set hive.exec.reducers.bytes.per.reducer=<number>
> > In order to limit the maximum number of reducers:
> >   set hive.exec.reducers.max=<number>
> > In order to set a constant number of reducers:
> >   set mapreduce.job.reduces=<number>
> > Starting Job = job_1476869096162_0503, Tracking URL = http://rhes564:8088/proxy/application_1476869096162_0503/
> > Kill Command = /home/hduser/hadoop-2.7.3/bin/hadoop job  -kill job_1476869096162_0503
> > Hadoop job information for Stage-1: number of mappers: 2; number of reducers: 1
> > 2016-10-20 16:45:01,146 Stage-1 map = 0%,  reduce = 0%
> > 2016-10-20 16:45:39,143 Stage-1 map = 100%,  reduce = 100%
> > Ended Job = job_1476869096162_0503 with errors
> > Error during job, obtaining debugging information...
> > Examining task ID: task_1476869096162_0503_m_000000 (and more) from job job_1476869096162_0503
> > Task with the most failures(4):
> > -----
> > Task ID:
> >   task_1476869096162_0503_m_000000
> > URL:
> >
> > http://rhes564:8088/taskdetails.jsp?jobid=job_1476869096162_0503&tipid=task_1476869096162_0503_m_000000
> > -----
> > Diagnostic Messages for this Task:
> > Error: java.io.IOException: java.io.IOException: java.lang.reflect.InvocationTargetException
> >         at org.apache.hadoop.hive.io.HiveIOExceptionHandlerChain.handleRecordReaderCreationException(HiveIOExceptionHandlerChain.java:97)
> >         at org.apache.hadoop.hive.io.HiveIOExceptionHandlerUtil.handleRecordReaderCreationException(HiveIOExceptionHandlerUtil.java:57)
> >         at org.apache.hadoop.hive.ql.io.HiveInputFormat.getRecordReader(HiveInputFormat.java:303)
> >         at org.apache.hadoop.hive.ql.io.CombineHiveInputFormat.getRecordReader(CombineHiveInputFormat.java:662)
> >         at org.apache.hadoop.mapred.MapTask$TrackedRecordReader.<init>(MapTask.java:169)
> >         at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:432)
> >         at org.apache.hadoop.mapred.MapTask.run(MapTask.java:343)
> >         at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:164)
> >         at java.security.AccessController.doPrivileged(Native Method)
> >         at javax.security.auth.Subject.doAs(Subject.java:422)
> >         at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1698)
> >         at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158)
> > Caused by: java.io.IOException: java.lang.reflect.InvocationTargetException
> >         at org.apache.hadoop.hbase.client.ConnectionFactory.createConnection(ConnectionFactory.java:240)
> >         at org.apache.hadoop.hbase.client.ConnectionManager.createConnection(ConnectionManager.java:420)
> >         at org.apache.hadoop.hbase.client.ConnectionManager.createConnection(ConnectionManager.java:413)
> >         at org.apache.hadoop.hbase.client.ConnectionManager.getConnectionInternal(ConnectionManager.java:291)
> >         at org.apache.hadoop.hbase.client.HTable.<init>(HTable.java:177)
> >         at org.apache.hadoop.hbase.client.HTable.<init>(HTable.java:157)
> >         at org.apache.hadoop.hive.hbase.HiveHBaseInputFormatUtil.getTable(HiveHBaseInputFormatUtil.java:50)
> >         at org.apache.hadoop.hive.hbase.HiveHBaseTableInputFormat.getRecordReader(HiveHBaseTableInputFormat.java:97)
> >         at org.apache.hadoop.hive.ql.io.HiveInputFormat.getRecordReader(HiveInputFormat.java:301)
> >         ... 9 more
> > Caused by: java.lang.reflect.InvocationTargetException
> >         at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
> >         at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
> >         at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
> >         at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
> >         at org.apache.hadoop.hbase.client.ConnectionFactory.createConnection(ConnectionFactory.java:238)
> >         ... 17 more
> > Caused by: java.lang.NoSuchMethodError: org.apache.hadoop.hbase.protobuf.generated.ClientProtos$Result$Builder.setStale(Z)Lorg/apache/hadoop/hbase/protobuf/generated/ClientProtos$Result$Builder;
> >         at org.apache.hadoop.hbase.protobuf.ProtobufUtil.<clinit>(ProtobufUtil.java:213)
> >         at org.apache.hadoop.hbase.ClusterId.parseFrom(ClusterId.java:64)
> >         at org.apache.hadoop.hbase.zookeeper.ZKClusterId.readClusterIdZNode(ZKClusterId.java:75)
> >         at org.apache.hadoop.hbase.client.ZooKeeperRegistry.getClusterId(ZooKeeperRegistry.java:105)
> >         at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.retrieveClusterId(ConnectionManager.java:879)
> >         at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.<init>(ConnectionManager.java:635)
> >         ... 22 more
> >
> > Thanks
> >
> >
> > Dr Mich Talebzadeh
> >
> >
>
