[ https://issues.apache.org/jira/browse/PHOENIX-3460?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15644770#comment-15644770 ]
Xindian Long commented on PHOENIX-3460:
---------------------------------------
Exception log:
16/11/03 16:32:25 INFO ZooKeeper: Initiating client connection, connectString=luna-sdp-nms-01.davis.sensus.lab:2181 sessionTimeout=90000 watcher=org.apache.hadoop.hbase.zookeeper.PendingWatcher@27898e13
16/11/03 16:32:25 INFO ClientCnxn: Opening socket connection to server 10.22.13.19/10.22.13.19:2181. Will not attempt to authenticate using SASL (unknown error)
16/11/03 16:32:25 INFO ClientCnxn: Socket connection established to 10.22.13.19/10.22.13.19:2181, initiating session
16/11/03 16:32:25 INFO ClientCnxn: Session establishment complete on server 10.22.13.19/10.22.13.19:2181, sessionid = 0x1582610cca900a6, negotiated timeout = 40000
16/11/03 16:32:25 INFO Metrics: Initializing metrics system: phoenix
16/11/03 16:32:25 WARN MetricsConfig: Cannot locate configuration: tried hadoop-metrics2-phoenix.properties,hadoop-metrics2.properties
16/11/03 16:32:25 INFO MetricsSystemImpl: Scheduled snapshot period at 10 second(s).
16/11/03 16:32:25 INFO MetricsSystemImpl: phoenix metrics system started
16/11/03 16:32:26 ERROR Application: sql error:
org.apache.phoenix.schema.TableNotFoundException: ERROR 1012 (42M03): Table undefined. tableName=ACME:ENDPOINT_STATUS
    at org.apache.phoenix.schema.PMetaDataImpl.getTableRef(PMetaDataImpl.java:265)
    at org.apache.phoenix.jdbc.PhoenixConnection.getTable(PhoenixConnection.java:449)
    at org.apache.phoenix.util.PhoenixRuntime.getTable(PhoenixRuntime.java:407)
    at org.apache.phoenix.util.PhoenixRuntime.generateColumnInfo(PhoenixRuntime.java:433)
    at org.apache.phoenix.mapreduce.util.PhoenixConfigurationUtil.getSelectColumnMetadataList(PhoenixConfigurationUtil.java:279)
    at org.apache.phoenix.spark.PhoenixRDD.toDataFrame(PhoenixRDD.scala:106)
    at org.apache.phoenix.spark.PhoenixRelation.schema(PhoenixRelation.scala:57)
    at org.apache.spark.sql.execution.datasources.LogicalRelation.<init>(LogicalRelation.scala:37)
    at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:125)
    at com.sensus.NMSEngineOnHadoop.Application.testSpark(Application.java:150)
    at com.sensus.NMSEngineOnHadoop.Application.main(Application.java:129)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:731)
    at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
    at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
16/11/03 16:32:26 ERROR Application: dataframe error:
java.lang.NullPointerException
    at com.sensus.NMSEngineOnHadoop.Application.testSpark(Application.java:157)
    at com.sensus.NMSEngineOnHadoop.Application.main(Application.java:129)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:731)
    at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
    at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
16/11/03 16:32:26 INFO SparkContext: Invoking stop() from shutdown hook
16/11/03 16:32:26 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/static/sql,null}
16/11/03 16:32:26 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/SQL/execution/json,null}
16/11/03 16:32:26 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/SQL/execution,null}
16/11/03 16:32:26 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/SQL/json,null}
16/11/03 16:32:26 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/SQL,null}
16/11/03 16:32:26 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/metrics/json,null}
16/11/03 16:32:26 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/stage/kill,null}
16/11/03 16:32:26 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/api,null}
16/11/03 16:32:26 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/,null}
16/11/03 16:32:26 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/static,null}
16/11/03 16:32:26 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/executors/threadDump/json,null}
16/11/03 16:32:26 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/executors/threadDump,null}
16/11/03 16:32:26 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/executors/json,null}
16/11/03 16:32:26 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/executors,null}
16/11/03 16:32:26 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/environment/json,null}
16/11/03 16:32:26 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/environment,null}
16/11/03 16:32:26 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/storage/rdd/json,null}
16/11/03 16:32:26 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/storage/rdd,null}
16/11/03 16:32:26 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/storage/json,null}
16/11/03 16:32:26 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/storage,null}
16/11/03 16:32:26 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/pool/json,null}
16/11/03 16:32:26 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/pool,null}
16/11/03 16:32:26 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/stage/json,null}
16/11/03 16:32:26 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/stage,null}
16/11/03 16:32:26 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/json,null}
16/11/03 16:32:26 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages,null}
16/11/03 16:32:26 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/jobs/job/json,null}
16/11/03 16:32:26 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/jobs/job,null}
16/11/03 16:32:26 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/jobs/json,null}
16/11/03 16:32:26 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/jobs,null}
16/11/03 16:32:26 INFO SparkUI: Stopped Spark web UI at http://192.168.100.10:4040
16/11/03 16:32:26 INFO MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
16/11/03 16:32:26 INFO MemoryStore: MemoryStore cleared
16/11/03 16:32:26 INFO BlockManager: BlockManager stopped
16/11/03 16:32:26 INFO BlockManagerMaster: BlockManagerMaster stopped
16/11/03 16:32:26 INFO OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
16/11/03 16:32:26 INFO SparkContext: Successfully stopped SparkContext
16/11/03 16:32:26 INFO ShutdownHookManager: Shutdown hook called
16/11/03 16:32:26 INFO ShutdownHookManager: Deleting directory /tmp/spark-6121baef-3d66-473e-8799-6733fb414ddd
16/11/03 16:32:26 INFO RemoteActorRefProvider$RemotingTerminator: Shutting down remote daemon.
16/11/03 16:32:26 INFO RemoteActorRefProvider$RemotingTerminator: Remote daemon shut down; proceeding with flushing remote transports.
16/11/03 16:32:26 INFO RemoteActorRefProvider$RemotingTerminator: Remoting shut down.
16/11/03 16:32:27 INFO ShutdownHookManager: Deleting directory /tmp/spark-6121baef-3d66-473e-8799-6733fb414ddd/httpd-0cfae97e-687e-47ce-af4e-301b0900a4c8
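
For reference, the failing call at Application.java:150 is a phoenix-spark DataFrame load along these lines. This is only a minimal sketch, not the attached test code: the table name and ZooKeeper quorum are taken from the log above, but everything else (class name, option choices) is assumed.

    import org.apache.spark.SparkConf;
    import org.apache.spark.api.java.JavaSparkContext;
    import org.apache.spark.sql.DataFrame;
    import org.apache.spark.sql.SQLContext;

    public class PhoenixSparkReadSketch {
        public static void main(String[] args) {
            SparkConf conf = new SparkConf().setAppName("phoenix-spark-read");
            JavaSparkContext sc = new JavaSparkContext(conf);
            SQLContext sqlContext = new SQLContext(sc);

            // Load the namespace-mapped table through the phoenix-spark data
            // source. This is the call that fails with
            // TableNotFoundException: ERROR 1012 (42M03) in the log above.
            DataFrame df = sqlContext.read()
                    .format("org.apache.phoenix.spark")
                    .option("table", "ACME:ENDPOINT_STATUS")                  // from the log
                    .option("zkUrl", "luna-sdp-nms-01.davis.sensus.lab:2181") // from the log
                    .load();

            df.show();
            sc.stop();
        }
    }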
> Phoenix Spark plugin cannot find table with a Namespace prefix
> --------------------------------------------------------------
>
> Key: PHOENIX-3460
> URL: https://issues.apache.org/jira/browse/PHOENIX-3460
> Project: Phoenix
> Issue Type: Bug
> Affects Versions: 4.8.0
> Environment: HDP 2.5
> Reporter: Xindian Long
> Labels: phoenix, spark
> Fix For: 4.7.0
>
>
> I am testing some code that uses the Phoenix Spark plugin to read a Phoenix
> table with a namespace prefix in the table name (the table was created as a
> Phoenix table, not an HBase table), but it throws a TableNotFoundException.
> The table is definitely there, because I can query it with plain Phoenix SQL
> through SQuirreL. In addition, querying it with Spark SQL works without any
> problem.
> I am running on the HDP 2.5 platform, with Phoenix 4.7.0.2.5.0.0-1245.
> The problem did not exist when I ran the same code on an HDP 2.4 cluster with
> Phoenix 4.4.
> Nor does the problem occur on HDP 2.5 when I query a table without a
> namespace prefix in the table name.
> The log is in the attached file: tableNoFound.txt
> My testing code is also attached.
> The weird thing is that, in the attached code, running testSpark alone gives
> the above exception, but running testJdbc first and then testSpark makes both
> of them work, as sketched below.
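> The following is a minimal sketch of that ordering under stated assumptions:
> the method names testJdbc/testSpark come from the attached code, but their
> bodies, the JDBC URL, and the query are my guesses, not the actual test code.
>
>     import java.sql.Connection;
>     import java.sql.DriverManager;
>     import java.sql.ResultSet;
>     import java.sql.Statement;
>     import org.apache.spark.sql.SQLContext;
>
>     public class OrderingSketch {
>         // Plain Phoenix JDBC query; this succeeds on its own.
>         static void testJdbc() throws Exception {
>             try (Connection conn = DriverManager.getConnection(
>                     "jdbc:phoenix:luna-sdp-nms-01.davis.sensus.lab:2181");
>                  Statement stmt = conn.createStatement();
>                  ResultSet rs = stmt.executeQuery(
>                          "SELECT COUNT(*) FROM ACME.ENDPOINT_STATUS")) {
>                 while (rs.next()) {
>                     System.out.println(rs.getLong(1));
>                 }
>             }
>         }
>
>         // phoenix-spark read; throws TableNotFoundException when run alone,
>         // but works when testJdbc() has already run in the same JVM.
>         static void testSpark(SQLContext sqlContext) {
>             sqlContext.read()
>                     .format("org.apache.phoenix.spark")
>                     .option("table", "ACME:ENDPOINT_STATUS")
>                     .option("zkUrl", "luna-sdp-nms-01.davis.sensus.lab:2181")
>                     .load()
>                     .show();
>         }
>     }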
> After changing the DDL to
> create table ACME.ENDPOINT_STATUS
> the phoenix-spark plugin seems to work. I also noticed some odd behavior:
> if I run both of the following,
> create table ACME.ENDPOINT_STATUS ...
> create table "ACME:ENDPOINT_STATUS" ...
> both tables show up in Phoenix, the first one with schema ACME and table name
> ENDPOINT_STATUS, and the second one with no schema and table name
> ACME:ENDPOINT_STATUS.
> However, in HBase I see only one table, ACME:ENDPOINT_STATUS. In addition,
> upserts into the table ACME.ENDPOINT_STATUS show up in the other table, and
> vice versa. A sketch of that experiment follows.
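> The following JDBC sketch reproduces that experiment under stated
> assumptions: the column list is invented just to make the DDL complete, and
> the JDBC URL is a guess; only the two table names come from the report.
>
>     import java.sql.Connection;
>     import java.sql.DriverManager;
>     import java.sql.Statement;
>
>     public class DoubleCreateSketch {
>         public static void main(String[] args) throws Exception {
>             try (Connection conn = DriverManager.getConnection(
>                     "jdbc:phoenix:luna-sdp-nms-01.davis.sensus.lab:2181");
>                  Statement stmt = conn.createStatement()) {
>                 // Hypothetical columns; the real DDL is elided ("...") above.
>                 stmt.execute("CREATE TABLE ACME.ENDPOINT_STATUS"
>                         + " (ID BIGINT NOT NULL PRIMARY KEY, STATUS VARCHAR)");
>                 stmt.execute("CREATE TABLE \"ACME:ENDPOINT_STATUS\""
>                         + " (ID BIGINT NOT NULL PRIMARY KEY, STATUS VARCHAR)");
>
>                 // Phoenix now lists two tables, but HBase holds only one
>                 // (ACME:ENDPOINT_STATUS), so this row is visible through
>                 // both Phoenix table names.
>                 stmt.execute("UPSERT INTO ACME.ENDPOINT_STATUS VALUES (1, 'UP')");
>                 conn.commit();
>             }
>         }
>     }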
>
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)