[ https://issues.apache.org/jira/browse/PHOENIX-5103?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16991955#comment-16991955 ]

Miles Yao commented on PHOENIX-5103:
------------------------------------

Hi -

We ran into this issue in our *Spark2* application running against our own build 
of Phoenix 4.14.0-cdh5.12.2, modified to provide a phoenix-spark2 client plugin 
compiled against Cloudera Spark 2.4.  The source code is lifted wholesale from 
phoenix-spark; the only change is a manual override of <spark.version> and 
<scala.version> in pom.xml (sketched below).  This Phoenix build had been working 
fine with the earlier Cloudera Spark2 v2.2.0.cloudera1, and it contains no 4.15 
code at all.
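
For reference, the override amounts to something like the following in the 
phoenix-spark2 module's pom.xml.  The version strings here are illustrative 
placeholders, not necessarily the exact Cloudera artifact versions we used:

    <properties>
      <!-- illustrative values only: pin the module to Cloudera's Spark 2.4
           distribution and the matching Scala release -->
      <spark.version>2.4.0.cloudera1</spark.version>
      <scala.version>2.11.12</scala.version>
    </properties>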

The stack trace we see is a bit different:

User class threw exception: org.apache.phoenix.schema.TableNotFoundException: ERROR 1012 (42M03): Table undefined. tableName=SYSTEM.CHILD_LINK
    at org.apache.phoenix.compile.FromCompiler$BaseColumnResolver.createTableRef(FromCompiler.java:604)
    at org.apache.phoenix.compile.FromCompiler$SingleTableColumnResolver.<init>(FromCompiler.java:395)
    at org.apache.phoenix.compile.FromCompiler$SingleTableColumnResolver.<init>(FromCompiler.java:387)
    at org.apache.phoenix.compile.FromCompiler.getResolverForMutation(FromCompiler.java:304)
    at org.apache.phoenix.compile.UpsertCompiler.compile(UpsertCompiler.java:352)
    at org.apache.phoenix.jdbc.PhoenixStatement$ExecutableUpsertStatement.compilePlan(PhoenixStatement.java:784)
    at org.apache.phoenix.jdbc.PhoenixStatement$ExecutableUpsertStatement.compilePlan(PhoenixStatement.java:770)
    at org.apache.phoenix.jdbc.PhoenixStatement$2.call(PhoenixStatement.java:401)
    at org.apache.phoenix.jdbc.PhoenixStatement$2.call(PhoenixStatement.java:391)
    at org.apache.phoenix.call.CallRunner.run(CallRunner.java:53)
    at org.apache.phoenix.jdbc.PhoenixStatement.executeMutation(PhoenixStatement.java:390)
    at org.apache.phoenix.jdbc.PhoenixStatement.executeMutation(PhoenixStatement.java:378)
    at org.apache.phoenix.jdbc.PhoenixStatement.execute(PhoenixStatement.java:1825)
    at org.apache.phoenix.util.UpgradeUtil.moveChildLinks(UpgradeUtil.java:1166)
    at org.apache.phoenix.query.ConnectionQueryServicesImpl.upgradeSystemCatalogIfRequired(ConnectionQueryServicesImpl.java:3008)
    at org.apache.phoenix.query.ConnectionQueryServicesImpl.upgradeSystemTables(ConnectionQueryServicesImpl.java:3085)
    at org.apache.phoenix.query.ConnectionQueryServicesImpl$12.call(ConnectionQueryServicesImpl.java:2608)
    at org.apache.phoenix.query.ConnectionQueryServicesImpl$12.call(ConnectionQueryServicesImpl.java:2515)
    at org.apache.phoenix.util.PhoenixContextExecutor.call(PhoenixContextExecutor.java:76)
    at org.apache.phoenix.query.ConnectionQueryServicesImpl.init(ConnectionQueryServicesImpl.java:2515)
    at org.apache.phoenix.jdbc.PhoenixDriver.getConnectionQueryServices(PhoenixDriver.java:255)
    at org.apache.phoenix.jdbc.PhoenixEmbeddedDriver.createConnection(PhoenixEmbeddedDriver.java:150)
    at org.apache.phoenix.jdbc.PhoenixDriver.connect(PhoenixDriver.java:221)
    at java.sql.DriverManager.getConnection(DriverManager.java:664)
    at java.sql.DriverManager.getConnection(DriverManager.java:208)
    at org.apache.phoenix.mapreduce.util.ConnectionUtil.getConnection(ConnectionUtil.java:113)
    at org.apache.phoenix.mapreduce.util.ConnectionUtil.getInputConnection(ConnectionUtil.java:58)
    at org.apache.phoenix.mapreduce.util.PhoenixConfigurationUtil.getSelectColumnMetadataList(PhoenixConfigurationUtil.java:354)
    at org.apache.phoenix.spark.PhoenixRDD.toDataFrame(PhoenixRDD.scala:118)
    at org.apache.phoenix.spark.PhoenixRelation.schema(PhoenixRelation.scala:60)
    at org.apache.spark.sql.execution.datasources.DataSource.resolveRelation(DataSource.scala:403)
    at org.apache.spark.sql.DataFrameReader.loadV1Source(DataFrameReader.scala:223)
    at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:211)
    at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:167)
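
For context, the trace comes from an ordinary phoenix-spark DataFrame read; the 
job does roughly the following (table name and ZooKeeper quorum below are 
placeholders, and `spark` is the job's SparkSession):

    // minimal sketch of the read that produces the trace above;
    // "MY_TABLE" and the zkUrl value are placeholders, not our real settings
    val df = spark.read
      .format("org.apache.phoenix.spark")
      .option("table", "MY_TABLE")
      .option("zkUrl", "zkhost1,zkhost2,zkhost3:2181")
      .load()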

 

I doubt we are the only ones attempting this, as legacy Spark 1 is being 
replaced across the board with Spark 2.  Has anyone else run into this issue?

 

Thanks,

Miles

 

> Can't create/drop table using 4.14 client against 4.15 server
> -------------------------------------------------------------
>
>                 Key: PHOENIX-5103
>                 URL: https://issues.apache.org/jira/browse/PHOENIX-5103
>             Project: Phoenix
>          Issue Type: Bug
>    Affects Versions: 4.15.0
>            Reporter: Vincent Poon
>            Assignee: Chinmay Kulkarni
>            Priority: Blocker
>             Fix For: 4.15.0, 5.1.0
>
>         Attachments: PHOENIX-5103-4.x-HBase-1.3.patch, 
> PHOENIX-5103-4.x-HBase-1.3_addendum.patch, PHOENIX-5103-master.patch
>
>          Time Spent: 2h 40m
>  Remaining Estimate: 0h
>
> server is running 4.15 commit e3280f
> Connect with 4.14.1 client.  Create table gives this:
> Caused by: org.apache.hadoop.hbase.ipc.RemoteWithExtrasException(org.apache.hadoop.hbase.TableNotFoundException): org.apache.hadoop.hbase.TableNotFoundException: Table 'SYSTEM:CHILD_LINK' was not found, got: SYSTEM:CATALOG.
>       at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.locateRegionInMeta(ConnectionManager.java:1362)
>       at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.locateRegion(ConnectionManager.java:1230)
>       at org.apache.hadoop.hbase.client.CoprocessorHConnection.locateRegion(CoprocessorHConnection.java:41)


