Hi, James
I couldn't agree more that this is a CDH issue. Since Phoenix has been included
in Cloudera Labs as a beta, many users would be pleased to use the latest
Phoenix 4.4 with their CDH environment, so there may be some leverage here : )
In fact we only changed two source files. The root cause turned out to be an
HBase jar incompatibility between CDH HBase and Apache HBase. The details are
listed below:
1. phoenix-core/src/main/java/org/apache/hadoop/hbase/regionserver/IndexSplitTransaction.java#L289
MetaTableAccessor.splitRegion: the native Apache HBase method takes 5
parameters, while the CDH HBase version takes 6, with an additional int
replication parameter. We modified the call to pass a default value of 0
(see the sketch after this list).
2. phoenix-core/src/main/java/org/apache/hadoop/hbase/regionserver/LocalIndexMerger.java#L84
RegionMergeTransactionImpl.prepareMutationsForMerge: this call was likewise
missing the replication parameter, and we again passed a default value of 0
(see the sketch after this list).
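For concreteness, here is a minimal sketch of the two call-site fixes. The
wrapper class and method names (CdhHBaseCompat, recordSplitInMeta,
prepareMerge) are illustrative rather than the actual code in those files, and
the 6-parameter CDH signatures are assumed from the description above:

    import java.io.IOException;
    import java.util.List;

    import org.apache.hadoop.hbase.HRegionInfo;
    import org.apache.hadoop.hbase.MetaTableAccessor;
    import org.apache.hadoop.hbase.ServerName;
    import org.apache.hadoop.hbase.client.Connection;
    import org.apache.hadoop.hbase.client.Mutation;
    import org.apache.hadoop.hbase.regionserver.RegionMergeTransactionImpl;

    final class CdhHBaseCompat {

        // Change 1 (IndexSplitTransaction):
        // Apache HBase 1.0: splitRegion(conn, parent, splitA, splitB, serverName)
        // CDH 5.4 HBase:    same, plus a trailing int replication parameter.
        static void recordSplitInMeta(Connection conn, HRegionInfo parent,
                HRegionInfo splitA, HRegionInfo splitB, ServerName sn)
                throws IOException {
            // Pass 0 as the extra argument so the call resolves against
            // the 6-parameter CDH method.
            MetaTableAccessor.splitRegion(conn, parent, splitA, splitB, sn, 0);
        }

        // Change 2 (LocalIndexMerger): CDH 5.4 likewise appends an int
        // replication parameter to prepareMutationsForMerge; again pass 0.
        static void prepareMerge(RegionMergeTransactionImpl rmt,
                HRegionInfo merged, HRegionInfo regionA, HRegionInfo regionB,
                ServerName sn, List<Mutation> metaEntries) throws IOException {
            rmt.prepareMutationsForMerge(merged, regionA, regionB, sn,
                    metaEntries, 0);
        }
    }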
We noticed that the master branch already passes these two parameters
correctly in the corresponding source files. Perhaps we could pick that up and
recompile Phoenix; before doing so, we hope the team can provide some guidance
and help.
Best regards,
Sun.
CertusNet
From: James Taylor
Date: 2015-06-03 22:18
To: [email protected]
CC: dev
Subject: Re: Phoenix 4.4 do not work with CDH 5.4
I'd look toward Cloudera to help with this. Phoenix has no control over what's
in the CDH distro, so ensuring compatibility isn't really feasible. I'd
encourage any and all Phoenix users running on CDH to let Cloudera know that
you'd like to see Phoenix added to their distro. They've started down that path
by including Phoenix in their Cloudera labs, which is a good first step.
In the meantime, what kind of source changes did you need to make, Fulin?
Perhaps we can merge those into Phoenix if they're not too invasive and don't
break backwards compatibility?
Thanks,
James
On Monday, June 1, 2015, Fulin Sun <[email protected]> wrote:
Hi,
We encountered this exact issue and finally resolved it by modifying the
phoenix-core source code and recompiling it against the CDH 5.4 version.
However, this does not seem like the right direction.
Since the latest CDH release has moved to HBase 1.0.0 and many HBase users are
considering or already using the CDH platform, we really hope the Phoenix team
can help find an appropriate solution for this.
Thanks,
Sun.
CertusNet
From: wangkun
Date: 2015-06-01 16:31
To: user
Subject: Re: Phoenix 4.4 do not work with CDH 5.4
Hi Yuhao,
Thank you for the suggestion about cloudera-labs-phoenix. It will be useful
for me in the future, but I found it does not yet support Phoenix 4.4, which
is the version that works with Spark, and I want to do some testing on the
Phoenix/Spark integration.
Did you manage to combine Phoenix 4.4 with CDH 5.4 successfully? I would
appreciate it if you could share your experience with that.
Thanks
Kevin Wang
On May 29, 2015, at 4:05 PM, Yuhao Bi <[email protected]> wrote:
Hi there,
I have had a similar experience.
I tried to combine the Phoenix 4.4 RC with my CDH 5.4.0 cluster; you have to
modify a bit of source code.
However, I suggest you use cloudera-labs-phoenix, which is compatible with CDH
and easy to deploy.
http://blog.cloudera.com/blog/2015/05/apache-phoenix-joins-cloudera-labs/
The source code is here: https://github.com/cloudera-labs/phoenix
2015-05-29 12:08 GMT+08:00 wangkun <[email protected]>:
Hi all,
I am using CDH 5.4.0 (which uses HBase 1.0.0) with phoenix-4.4.0-HBase-1.0. I
copied phoenix-4.4.0-HBase-1.0-server.jar to the HBase lib directory and
restarted HBase successfully.
When I ran sqlline.py to access it, I got the following exception:
[yimr@yi07 bin]$ ./sqlline.py localhost
Setting property: [isolation, TRANSACTION_READ_COMMITTED]
issuing: !connect jdbc:phoenix:localhost none none org.apache.phoenix.jdbc.PhoenixDriver
Connecting to jdbc:phoenix:localhost
15/05/29 11:00:32 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
15/05/29 11:00:32 INFO metrics.Metrics: Initializing metrics system: phoenix
15/05/29 11:00:32 INFO impl.MetricsConfig: loaded properties from hadoop-metrics2-phoenix.properties
15/05/29 11:00:32 INFO trace.PhoenixMetricsSink: Writing tracing metrics to phoenix table
15/05/29 11:00:32 INFO trace.PhoenixMetricsSink: Phoenix tracing writer started
15/05/29 11:00:32 INFO impl.MetricsSinkAdapter: Sink tracing started
15/05/29 11:00:32 INFO impl.MetricsSystemImpl: Scheduled snapshot period at 10 second(s).
15/05/29 11:00:32 INFO impl.MetricsSystemImpl: phoenix metrics system started
15/05/29 11:00:33 INFO query.ConnectionQueryServicesImpl: Found quorum: localhost:2181
15/05/29 11:00:33 INFO client.ConnectionManager$HConnectionImplementation: Closing master protocol: MasterService
15/05/29 11:00:33 INFO client.ConnectionManager$HConnectionImplementation: Closing zookeeper sessionid=0x14d9a9d103c3d4f
15/05/29 11:00:33 WARN ipc.CoprocessorRpcChannel: Call failed on IOException
org.apache.hadoop.hbase.DoNotRetryIOException: org.apache.hadoop.hbase.DoNotRetryIOException: SYSTEM.CATALOG: org.apache.hadoop.hbase.client.Scan.setRaw(Z)Lorg/apache/hadoop/hbase/client/Scan;
    at org.apache.phoenix.util.ServerUtil.createIOException(ServerUtil.java:84)
    at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.createTable(MetaDataEndpointImpl.java:1148)
    at org.apache.phoenix.coprocessor.generated.MetaDataProtos$MetaDataService.callMethod(MetaDataProtos.java:10515)
    at org.apache.hadoop.hbase.regionserver.HRegion.execService(HRegion.java:7054)
    at org.apache.hadoop.hbase.regionserver.RSRpcServices.execServiceOnRegion(RSRpcServices.java:1741)
    at org.apache.hadoop.hbase.regionserver.RSRpcServices.execService(RSRpcServices.java:1723)
    at org.apache.hadoop.hbase.protobuf.generated.ClientProtos$ClientService$2.callBlockingMethod(ClientProtos.java:31447)
    at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2035)
    at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:107)
    at org.apache.hadoop.hbase.ipc.RpcExecutor.consumerLoop(RpcExecutor.java:130)
    at org.apache.hadoop.hbase.ipc.RpcExecutor$1.run(RpcExecutor.java:107)
    at java.lang.Thread.run(Thread.java:745)
Caused by: java.lang.NoSuchMethodError: org.apache.hadoop.hbase.client.Scan.setRaw(Z)Lorg/apache/hadoop/hbase/client/Scan;
    at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.buildDeletedTable(MetaDataEndpointImpl.java:925)
    at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.loadTable(MetaDataEndpointImpl.java:1001)
    at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.createTable(MetaDataEndpointImpl.java:1097)
    ... 10 more
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
    at org.apache.hadoop.ipc.RemoteException.instantiateException(RemoteException.java:106)
    at org.apache.hadoop.ipc.RemoteException.unwrapRemoteException(RemoteException.java:95)
    at org.apache.hadoop.hbase.protobuf.ProtobufUtil.getRemoteException(ProtobufUtil.java:313)
    at org.apache.hadoop.hbase.protobuf.ProtobufUtil.execService(ProtobufUtil.java:1609)
    at org.apache.hadoop.hbase.ipc.RegionCoprocessorRpcChannel$1.call(RegionCoprocessorRpcChannel.java:92)
    at org.apache.hadoop.hbase.ipc.RegionCoprocessorRpcChannel$1.call(RegionCoprocessorRpcChannel.java:89)
    at org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithRetries(RpcRetryingCaller.java:126)
    at org.apache.hadoop.hbase.ipc.RegionCoprocessorRpcChannel.callExecService(RegionCoprocessorRpcChannel.java:95)
    at org.apache.hadoop.hbase.ipc.CoprocessorRpcChannel.callMethod(CoprocessorRpcChannel.java:56)
    at org.apache.phoenix.coprocessor.generated.MetaDataProtos$MetaDataService$Stub.createTable(MetaDataProtos.java:10695)
    at org.apache.phoenix.query.ConnectionQueryServicesImpl$6.call(ConnectionQueryServicesImpl.java:1261)
    at org.apache.phoenix.query.ConnectionQueryServicesImpl$6.call(ConnectionQueryServicesImpl.java:1250)
    at org.apache.hadoop.hbase.client.HTable$16.call(HTable.java:1737)
    at java.util.concurrent.FutureTask.run(FutureTask.java:262)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
    at java.lang.Thread.run(Thread.java:745)
Caused by: org.apache.hadoop.hbase.ipc.RemoteWithExtrasException(org.apache.hadoop.hbase.DoNotRetryIOException): org.apache.hadoop.hbase.DoNotRetryIOException: SYSTEM.CATALOG: org.apache.hadoop.hbase.client.Scan.setRaw(Z)Lorg/apache/hadoop/hbase/client/Scan;
    at org.apache.phoenix.util.ServerUtil.createIOException(ServerUtil.java:84)
    at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.createTable(MetaDataEndpointImpl.java:1148)
    at org.apache.phoenix.coprocessor.generated.MetaDataProtos$MetaDataService.callMethod(MetaDataProtos.java:10515)
    at org.apache.hadoop.hbase.regionserver.HRegion.execService(HRegion.java:7054)
    at org.apache.hadoop.hbase.regionserver.RSRpcServices.execServiceOnRegion(RSRpcServices.java:1741)
    at org.apache.hadoop.hbase.regionserver.RSRpcServices.execService(RSRpcServices.java:1723)
    at org.apache.hadoop.hbase.protobuf.generated.ClientProtos$ClientService$2.callBlockingMethod(ClientProtos.java:31447)
    at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2035)
    at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:107)
    at org.apache.hadoop.hbase.ipc.RpcExecutor.consumerLoop(RpcExecutor.java:130)
    at org.apache.hadoop.hbase.ipc.RpcExecutor$1.run(RpcExecutor.java:107)
    at java.lang.Thread.run(Thread.java:745)
Caused by: java.lang.NoSuchMethodError: org.apache.hadoop.hbase.client.Scan.setRaw(Z)Lorg/apache/hadoop/hbase/client/Scan;
    at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.buildDeletedTable(MetaDataEndpointImpl.java:925)
    at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.loadTable(MetaDataEndpointImpl.java:1001)
    at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.createTable(MetaDataEndpointImpl.java:1097)
    ... 10 more
    at org.apache.hadoop.hbase.ipc.RpcClientImpl.call(RpcClientImpl.java:1199)
    at org.apache.hadoop.hbase.ipc.AbstractRpcClient.callBlockingMethod(AbstractRpcClient.java:216)
    at org.apache.hadoop.hbase.ipc.AbstractRpcClient$BlockingRpcChannelImplementation.callBlockingMethod(AbstractRpcClient.java:300)
    at org.apache.hadoop.hbase.protobuf.generated.ClientProtos$ClientService$BlockingStub.execService(ClientProtos.java:31913)
    at org.apache.hadoop.hbase.protobuf.ProtobufUtil.execService(ProtobufUtil.java:1605)
    ... 13 more
15/05/29 11:00:33 WARN client.HTable: Error calling coprocessor service org.apache.phoenix.coprocessor.generated.MetaDataProtos$MetaDataService for row \x00SYSTEM\x00CATALOG
java.util.concurrent.ExecutionException: org.apache.hadoop.hbase.DoNotRetryIOException: org.apache.hadoop.hbase.DoNotRetryIOException: SYSTEM.CATALOG: org.apache.hadoop.hbase.client.Scan.setRaw(Z)Lorg/apache/hadoop/hbase/client/Scan;
    at org.apache.phoenix.util.ServerUtil.createIOException(ServerUtil.java:84)
    at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.createTable(MetaDataEndpointImpl.java:1148)
    at org.apache.phoenix.coprocessor.generated.MetaDataProtos$MetaDataService.callMethod(MetaDataProtos.java:10515)
    at org.apache.hadoop.hbase.regionserver.HRegion.execService(HRegion.java:7054)
    at org.apache.hadoop.hbase.regionserver.RSRpcServices.execServiceOnRegion(RSRpcServices.java:1741)
    at org.apache.hadoop.hbase.regionserver.RSRpcServices.execService(RSRpcServices.java:1723)
    at org.apache.hadoop.hbase.protobuf.generated.ClientProtos$ClientService$2.callBlockingMethod(ClientProtos.java:31447)
    at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2035)
    at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:107)
    at org.apache.hadoop.hbase.ipc.RpcExecutor.consumerLoop(RpcExecutor.java:130)
    at org.apache.hadoop.hbase.ipc.RpcExecutor$1.run(RpcExecutor.java:107)
    at java.lang.Thread.run(Thread.java:745)
Caused by: java.lang.NoSuchMethodError: org.apache.hadoop.hbase.client.Scan.setRaw(Z)Lorg/apache/hadoop/hbase/client/Scan;
    at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.buildDeletedTable(MetaDataEndpointImpl.java:925)
    at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.loadTable(MetaDataEndpointImpl.java:1001)
    at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.createTable(MetaDataEndpointImpl.java:1097)
    ... 10 more
    at java.util.concurrent.FutureTask.report(FutureTask.java:122)
    at java.util.concurrent.FutureTask.get(FutureTask.java:188)
    at org.apache.hadoop.hbase.client.HTable.coprocessorService(HTable.java:1749)
    at org.apache.hadoop.hbase.client.HTable.coprocessorService(HTable.java:1705)
    at org.apache.phoenix.query.ConnectionQueryServicesImpl.metaDataCoprocessorExec(ConnectionQueryServicesImpl.java:1024)
    at org.apache.phoenix.query.ConnectionQueryServicesImpl.metaDataCoprocessorExec(ConnectionQueryServicesImpl.java:1004)
    at org.apache.phoenix.query.ConnectionQueryServicesImpl.createTable(ConnectionQueryServicesImpl.java:1249)
    at org.apache.phoenix.query.DelegateConnectionQueryServices.createTable(DelegateConnectionQueryServices.java:112)
    at org.apache.phoenix.schema.MetaDataClient.createTableInternal(MetaDataClient.java:1902)
    at org.apache.phoenix.schema.MetaDataClient.createTable(MetaDataClient.java:744)
    at org.apache.phoenix.compile.CreateTableCompiler$2.execute(CreateTableCompiler.java:186)
    at org.apache.phoenix.jdbc.PhoenixStatement$2.call(PhoenixStatement.java:303)
    at org.apache.phoenix.jdbc.PhoenixStatement$2.call(PhoenixStatement.java:295)
    at org.apache.phoenix.call.CallRunner.run(CallRunner.java:53)
    at org.apache.phoenix.jdbc.PhoenixStatement.executeMutation(PhoenixStatement.java:293)
    at org.apache.phoenix.jdbc.PhoenixStatement.executeUpdate(PhoenixStatement.java:1236)
    at org.apache.phoenix.query.ConnectionQueryServicesImpl$12.call(ConnectionQueryServicesImpl.java:1891)
    at org.apache.phoenix.query.ConnectionQueryServicesImpl$12.call(ConnectionQueryServicesImpl.java:1860)
    at org.apache.phoenix.util.PhoenixContextExecutor.call(PhoenixContextExecutor.java:77)
    at org.apache.phoenix.query.ConnectionQueryServicesImpl.init(ConnectionQueryServicesImpl.java:1860)
    at org.apache.phoenix.jdbc.PhoenixDriver.getConnectionQueryServices(PhoenixDriver.java:162)
    at org.apache.phoenix.jdbc.PhoenixEmbeddedDriver.connect(PhoenixEmbeddedDriver.java:131)
    at org.apache.phoenix.jdbc.PhoenixDriver.connect(PhoenixDriver.java:133)
    at sqlline.DatabaseConnection.connect(DatabaseConnection.java:157)
    at sqlline.DatabaseConnection.getConnection(DatabaseConnection.java:203)
    at sqlline.Commands.connect(Commands.java:1064)
    at sqlline.Commands.connect(Commands.java:996)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at sqlline.ReflectiveCommandHandler.execute(ReflectiveCommandHandler.java:36)
    at sqlline.SqlLine.dispatch(SqlLine.java:804)
    at sqlline.SqlLine.initArgs(SqlLine.java:588)
    at sqlline.SqlLine.begin(SqlLine.java:656)
    at sqlline.SqlLine.start(SqlLine.java:398)
    at sqlline.SqlLine.main(SqlLine.java:292)
Any suggestions to help resolve this problem would be greatly appreciated.
Thanks
Kevin Wang