RE: [External] Re: Re: error when using apache-phoenix-4.14.0-HBase-1.2-bin with hbase 1.2.6

2018-08-06 Thread Lu, Wei
As the log indicates, you should 'Set hbase.table.sanity.checks to false at conf or table descriptor if you want to bypass sanity checks'.
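
A minimal Java sketch of the table-descriptor route, for illustration only (the table name is hypothetical, and this only applies to tables you create yourself; for SYSTEM.CATALOG, which Phoenix creates, the relevant setting is hbase-site.xml on the HMaster, followed by a restart):

import org.apache.hadoop.hbase.HTableDescriptor;
import org.apache.hadoop.hbase.TableName;

public class SanityCheckSketch {
    public static void main(String[] args) {
        // Hypothetical table; bypassing sanity checks only masks the underlying problem.
        HTableDescriptor desc = new HTableDescriptor(TableName.valueOf("MYTABLE"));
        // Per-table override of the check the HMaster applies in sanityCheckTableDescriptor().
        desc.setConfiguration("hbase.table.sanity.checks", "false");
        System.out.println(desc);
    }
}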



From: 倪项菲 <nixiangfei_...@chinamobile.com>
Sent: Tuesday, August 7, 2018 9:30 AM
To: Jaanai Zhang ; user 
Subject: [External] Re: Re: error when using 
apache-phoenix-4.14.0-HBase-1.2-bin with hbase 1.2.6

Hi Zhang Yun,
How do I deploy the Phoenix server? I only have the information from the Phoenix website, and it doesn't mention a Phoenix server.



From: Jaanai Zhang
Sent: 2018/08/07 (Tuesday) 09:16
To: user;
主题: Re: error when using apache-phoenix-4.14.0-HBase-1.2-bin with hbase 1.2.6
Please ensure your Phoenix server has been deployed and restarted.



   Yun Zhang
   Best regards!


2018-08-07 9:10 GMT+08:00 倪项菲 <nixiangfei_...@chinamobile.com>:

Hi Experts,
I am using HBase 1.2.6. The cluster is working well with HMaster HA, but when we integrated Phoenix with HBase it failed. Below are the steps:
1. Download apache-phoenix-4.14.0-HBase-1.2-bin from http://phoenix.apache.org, then copy the tar file to the HMaster and unpack it.
2. Copy phoenix-core-4.14.0-HBase-1.2.jar and phoenix-4.14.0-HBase-1.2-server.jar to all HBase nodes, including HMaster and HRegionServers, and put them in the HBase lib directory; my path is /opt/hbase-1.2.6/lib.
3. Restart the HBase cluster.
4. Then start to use Phoenix, but it returns the error below:
  [apache@plat-ecloud01-bigdata-journalnode01 bin]$ ./sqlline.py plat-ecloud01-bigdata-zk01,plat-ecloud01-bigdata-zk02,plat-ecloud01-bigdata-zk03
Setting property: [incremental, false]
Setting property: [isolation, TRANSACTION_READ_COMMITTED]
issuing: !connect jdbc:phoenix:plat-ecloud01-bigdata-zk01 none none org.apache.phoenix.jdbc.PhoenixDriver
Connecting to jdbc:phoenix:plat-ecloud01-bigdata-zk01,plat-ecloud01-bigdata-zk02,plat-ecloud01-bigdata-zk03
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/opt/apache-phoenix-4.14.0-HBase-1.2-bin/phoenix-4.14.0-HBase-1.2-client.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/opt/hadoop-2.7.6/share/hadoop/common/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
18/08/06 18:40:08 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Error: org.apache.hadoop.hbase.DoNotRetryIOException: Unable to load configured region split policy 'org.apache.phoenix.schema.MetaDataSplitPolicy' for table 'SYSTEM.CATALOG' Set hbase.table.sanity.checks to false at conf or table descriptor if you want to bypass sanity checks
    at org.apache.hadoop.hbase.master.HMaster.warnOrThrowExceptionForFailure(HMaster.java:1754)
    at org.apache.hadoop.hbase.master.HMaster.sanityCheckTableDescriptor(HMaster.java:1615)
    at org.apache.hadoop.hbase.master.HMaster.createTable(HMaster.java:1541)
    at org.apache.hadoop.hbase.master.MasterRpcServices.createTable(MasterRpcServices.java:463)
    at org.apache.hadoop.hbase.protobuf.generated.MasterProtos$MasterService$2.callBlockingMethod(MasterProtos.java:55682)
    at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2196)
    at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:112)
    at org.apache.hadoop.hbase.ipc.RpcExecutor.consumerLoop(RpcExecutor.java:133)
    at org.apache.hadoop.hbase.ipc.RpcExecutor$1.run(RpcExecutor.java:108)
    at java.lang.Thread.run(Thread.java:745) (state=08000,code=101)
org.apache.phoenix.exception.PhoenixIOException: org.apache.hadoop.hbase.DoNotRetryIOException: Unable to load configured region split policy 'org.apache.phoenix.schema.MetaDataSplitPolicy' for table 'SYSTEM.CATALOG' Set hbase.table.sanity.checks to false at conf or table descriptor if you want to bypass sanity checks
    at org.apache.hadoop.hbase.master.HMaster.warnOrThrowExceptionForFailure(HMaster.java:1754)
    at org.apache.hadoop.hbase.master.HMaster.sanityCheckTableDescriptor(HMaster.java:1615)
    at org.apache.hadoop.hbase.master.HMaster.createTable(HMaster.java:1541)
    at org.apache.hadoop.hbase.master.MasterRpcServices.createTable(MasterRpcServices.java:463)
    at org.apache.hadoop.hbase.protobuf.generated.MasterProtos$MasterService$2.callBlockingMethod(MasterProtos.java:55682)
    at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2196)
    at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:112)
    at org.apache.hadoop.hbase.ipc.RpcExecutor.consumerLoop(RpcExecutor.java:133)
    at org.apache.hadoop.hbase.ipc.RpcExecutor$1.run(RpcExecutor.java:108)
    at java.lang.Thread.run(Thread.java:745)


Re: RE: phoenix query server java.lang.ClassCastException for BIGINT ARRAY column

2018-04-20 Thread Lu Wei
I did some digging, and the reason is that I started PQS with JSON serialization rather than PROTOBUF.

When I switch to PROTOBUF serialization, the 'select * from testarray' query works fine.


JSON has no distinct type for numbers, so a JSON array [100] is parsed into an array containing an Integer value. When items are fetched from the SQL result set, the 100 (an Integer) is cast to Long (the type defined in the table), and that cast throws the exception.
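
A minimal Java sketch of that failure mode, for illustration only (the real cast happens inside Avatica's LongAccessor, as the stack trace below shows; this just reproduces the boxing mismatch):

public class CastDemo {
    public static void main(String[] args) {
        // A JSON-decoded number arrives boxed as the narrowest type, Integer.
        Object fromJson = Integer.valueOf(100);
        // The BIGINT column maps to Long, so this unchecked cast fails at
        // runtime with java.lang.ClassCastException.
        Long value = (Long) fromJson;
        System.out.println(value);
    }
}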


I guess we had better use PROTOBUF rather than JSON as the serialization for PQS.
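
For reference, a sketch of pinning the thin client to PROTOBUF through the JDBC URL (host, port, and query are placeholders; the server side is governed by the phoenix.queryserver.serialization property in hbase-site.xml, and client and server must match):

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class ThinClientDemo {
    public static void main(String[] args) throws Exception {
        // Thin driver from the Phoenix thin-client jar (assumed to be on the classpath).
        Class.forName("org.apache.phoenix.queryserver.client.Driver");
        // Placeholder host/port; 'serialization' must match the server setting.
        String url = "jdbc:phoenix:thin:url=http://pqs-host:8765;serialization=PROTOBUF";
        try (Connection conn = DriverManager.getConnection(url);
             Statement stmt = conn.createStatement();
             ResultSet rs = stmt.executeQuery("select * from testarray")) {
            while (rs.next()) {
                // With matching serialization, the BIGINT ARRAY column reads cleanly.
                System.out.println(rs.getLong("ID") + " -> " + rs.getString("EVENTS"));
            }
        }
    }
}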


From: sergey.solda...@gmail.com <sergey.solda...@gmail.com> on behalf of Sergey 
Soldatov <sergeysolda...@gmail.com>
Sent: Friday, April 20, 2018 5:22:47 AM
To: user@phoenix.apache.org
Subject: Re: RE: phoenix query server java.lang.ClassCastException for BIGINT ARRAY column

Definitely, someone who is maintaining the CDH branch should take a look. I don't observe that behavior on the master branch:

0: jdbc:phoenix:thin:url=http://localhost:876> create table if not exists testarray(id bigint not null, events bigint array constraint pk primary key (id));
No rows affected (2.4 seconds)
0: jdbc:phoenix:thin:url=http://localhost:876> upsert into testarray values (1, array[1,2]);
1 row affected (0.056 seconds)
0: jdbc:phoenix:thin:url=http://localhost:876> select * from testarray;
+-----+---------+
| ID  | EVENTS  |
+-----+---------+
| 1   | [1, 2]  |
+-----+---------+
1 row selected (0.068 seconds)
0: jdbc:phoenix:thin:url=http://localhost:876>


Thanks,
Sergey

On Thu, Apr 19, 2018 at 12:57 PM, Lu Wei <wey...@outlook.com> wrote:

By the way, all the queries are run in sqlline-thin.py.



________
From: Lu Wei
Sent: April 19, 2018 6:51:15
To: user@phoenix.apache.org
Subject: RE: phoenix query server java.lang.ClassCastException for BIGINT ARRAY column


## Version:
phoenix: 4.13.2-cdh5.11.2
hive: 1.1.0-cdh5.11.2

to reproduce:

-- create table

create table if not exists testarray(id bigint not null, events bigint array 
constraint pk primary key (id))


-- upsert data:

upsert into testarray values (1, array[1,2]);


-- query:

select id from testarray;   -- fine

select * from testarray;    -- error


From: sergey.solda...@gmail.com <sergey.solda...@gmail.com> on behalf of Sergey Soldatov <sergeysolda...@gmail.com>
Sent: April 19, 2018 6:37:06
To: user@phoenix.apache.org
Subject: Re: phoenix query server java.lang.ClassCastException for BIGINT ARRAY column

Could you please be more specific? Which version of phoenix are you using? Do 
you have a small script to reproduce? At first glance it looks like a PQS bug.

Thanks,
Sergey

On Thu, Apr 19, 2018 at 8:17 AM, Lu Wei <wey...@outlook.com> wrote:

Hi there,

I have a Phoenix table containing a BIGINT ARRAY column, but when querying through the query server (via sqlline-thin.py), there is an exception:

java.lang.ClassCastException: java.lang.Integer cannot be cast to java.lang.Long

BTW, when querying through sqlline.py everything works fine, and the data in the HBase table are of Long type, so why does the Integer-to-Long cast happen?



## Table schema:

create table if not exists gis_tracking3(tracking_object_id bigint not null, 
lat double, lon double, speed double, bearing double, time timestamp not null, 
events bigint array constraint pk primary key (tracking_object_id, time))


## When querying events[1], it works fine:

0: jdbc:phoenix:thin:url=http://10.10.13.87:8> select events[1]+1 from gis_tracking3;
+------------------------------+
| (ARRAY_ELEM(EVENTS, 1) + 1)  |
+------------------------------+
| 11                           |
| 2223                         |
| null                         |
| null                         |
| 10001                        |
+------------------------------+



## When querying events, it throws an exception:

0: jdbc:phoenix:thin:url=http://10.10.13.87:8> select events from gis_tracking3;
java.lang.ClassCastException: java.lang.Integer cannot be cast to java.lang.Long
  at org.apache.phoenix.shaded.org.apache.calcite.avatica.util.AbstractCursor$LongAccessor.getLong(AbstractCursor.java:550)
  at org.apache.phoenix.shaded.org.apache.calcite.avatica.util.AbstractCursor$ArrayAccessor.convertValue(AbstractCursor.java:1310)
  at org.apache.phoenix.shaded.org.apache.calcite.avatica.util.AbstractCursor$ArrayAccessor.getObject(AbstractCursor.java:1289)
  at org.apache.phoenix.shaded.org.apache.calcite.avatica.util.AbstractCursor$A

RE: phoenix query server java.lang.ClassCastException for BIGINT ARRAY column

2018-04-19 Thread Lu Wei
## Version:
phoenix: 4.13.2-cdh5.11.2
hive: 1.1.0-cdh5.11.2

to reproduce:

-- create table

create table if not exists testarray(id bigint not null, events bigint array 
constraint pk primary key (id))


-- upsert data:

upsert into testarray values (1, array[1,2]);


-- query:

select id from testarray;   -- fine

select * from testarray;    -- error


From: sergey.solda...@gmail.com <sergey.solda...@gmail.com> on behalf of Sergey Soldatov <sergeysolda...@gmail.com>
Sent: April 19, 2018 6:37:06
To: user@phoenix.apache.org
Subject: Re: phoenix query server java.lang.ClassCastException for BIGINT ARRAY column

Could you please be more specific? Which version of phoenix are you using? Do 
you have a small script to reproduce? At first glance it looks like a PQS bug.

Thanks,
Sergey

On Thu, Apr 19, 2018 at 8:17 AM, Lu Wei <wey...@outlook.com> wrote:

Hi there,

I have a Phoenix table containing a BIGINT ARRAY column, but when querying through the query server (via sqlline-thin.py), there is an exception:

java.lang.ClassCastException: java.lang.Integer cannot be cast to java.lang.Long

BTW, when querying through sqlline.py everything works fine, and the data in the HBase table are of Long type, so why does the Integer-to-Long cast happen?



## Table schema:

create table if not exists gis_tracking3(tracking_object_id bigint not null, 
lat double, lon double, speed double, bearing double, time timestamp not null, 
events bigint array constraint pk primary key (tracking_object_id, time))


## When querying events[1], it works fine:

0: jdbc:phoenix:thin:url=http://10.10.13.87:8> select events[1]+1 from gis_tracking3;
+------------------------------+
| (ARRAY_ELEM(EVENTS, 1) + 1)  |
+------------------------------+
| 11                           |
| 2223                         |
| null                         |
| null                         |
| 10001                        |
+------------------------------+



## When querying events, it throws an exception:

0: jdbc:phoenix:thin:url=http://10.10.13.87:8> select events from gis_tracking3;
java.lang.ClassCastException: java.lang.Integer cannot be cast to java.lang.Long
  at org.apache.phoenix.shaded.org.apache.calcite.avatica.util.AbstractCursor$LongAccessor.getLong(AbstractCursor.java:550)
  at org.apache.phoenix.shaded.org.apache.calcite.avatica.util.AbstractCursor$ArrayAccessor.convertValue(AbstractCursor.java:1310)
  at org.apache.phoenix.shaded.org.apache.calcite.avatica.util.AbstractCursor$ArrayAccessor.getObject(AbstractCursor.java:1289)
  at org.apache.phoenix.shaded.org.apache.calcite.avatica.util.AbstractCursor$ArrayAccessor.getArray(AbstractCursor.java:1342)
  at org.apache.phoenix.shaded.org.apache.calcite.avatica.util.AbstractCursor$ArrayAccessor.getString(AbstractCursor.java:1354)
  at org.apache.phoenix.shaded.org.apache.calcite.avatica.AvaticaResultSet.getString(AvaticaResultSet.java:257)
  at sqlline.Rows$Row.<init>(Rows.java:183)
  at sqlline.BufferedRows.<init>(BufferedRows.java:38)
  at sqlline.SqlLine.print(SqlLine.java:1660)
  at sqlline.Commands.execute(Commands.java:833)
  at sqlline.Commands.sql(Commands.java:732)
  at sqlline.SqlLine.dispatch(SqlLine.java:813)
  at sqlline.SqlLine.begin(SqlLine.java:686)
  at sqlline.SqlLine.start(SqlLine.java:398)
  at sqlline.SqlLine.main(SqlLine.java:291)
  at org.apache.phoenix.queryserver.client.SqllineWrapper.main(SqllineWrapper.java:93)



I guess there is some issue in the query server, but I can't figure out why.

Any suggestions?



Thanks,

Wei



RE: phoenix query server java.lang.ClassCastException for BIGINT ARRAY column

2018-04-19 Thread Lu Wei
By the way, all the queries are run in sqlline-thin.py.




From: Lu Wei
Sent: April 19, 2018 6:51:15
To: user@phoenix.apache.org
Subject: RE: phoenix query server java.lang.ClassCastException for BIGINT ARRAY column


## Version:
phoenix: 4.13.2-cdh5.11.2
hive: 1.1.0-cdh5.11.2

to reproduce:

-- create table

create table if not exists testarray(id bigint not null, events bigint array 
constraint pk primary key (id))


-- upsert data:

upsert into testarray values (1, array[1,2]);


-- query:

select id from testarray;   -- fine

select * from testarray;    -- error


From: sergey.solda...@gmail.com <sergey.solda...@gmail.com> on behalf of Sergey Soldatov <sergeysolda...@gmail.com>
Sent: April 19, 2018 6:37:06
To: user@phoenix.apache.org
Subject: Re: phoenix query server java.lang.ClassCastException for BIGINT ARRAY column

Could you please be more specific? Which version of phoenix are you using? Do 
you have a small script to reproduce? At first glance it looks like a PQS bug.

Thanks,
Sergey

On Thu, Apr 19, 2018 at 8:17 AM, Lu Wei <wey...@outlook.com> wrote:

Hi there,

I have a Phoenix table containing a BIGINT ARRAY column, but when querying through the query server (via sqlline-thin.py), there is an exception:

java.lang.ClassCastException: java.lang.Integer cannot be cast to java.lang.Long

BTW, when querying through sqlline.py everything works fine, and the data in the HBase table are of Long type, so why does the Integer-to-Long cast happen?



## Table schema:

create table if not exists gis_tracking3(tracking_object_id bigint not null, 
lat double, lon double, speed double, bearing double, time timestamp not null, 
events bigint array constraint pk primary key (tracking_object_id, time))


## When querying events[1], it works fine:

0: jdbc:phoenix:thin:url=http://10.10.13.87:8> select events[1]+1 from gis_tracking3;
+------------------------------+
| (ARRAY_ELEM(EVENTS, 1) + 1)  |
+------------------------------+
| 11                           |
| 2223                         |
| null                         |
| null                         |
| 10001                        |
+------------------------------+



## When querying events, it throws an exception:

0: jdbc:phoenix:thin:url=http://10.10.13.87:8> select events from gis_tracking3;
java.lang.ClassCastException: java.lang.Integer cannot be cast to java.lang.Long
  at org.apache.phoenix.shaded.org.apache.calcite.avatica.util.AbstractCursor$LongAccessor.getLong(AbstractCursor.java:550)
  at org.apache.phoenix.shaded.org.apache.calcite.avatica.util.AbstractCursor$ArrayAccessor.convertValue(AbstractCursor.java:1310)
  at org.apache.phoenix.shaded.org.apache.calcite.avatica.util.AbstractCursor$ArrayAccessor.getObject(AbstractCursor.java:1289)
  at org.apache.phoenix.shaded.org.apache.calcite.avatica.util.AbstractCursor$ArrayAccessor.getArray(AbstractCursor.java:1342)
  at org.apache.phoenix.shaded.org.apache.calcite.avatica.util.AbstractCursor$ArrayAccessor.getString(AbstractCursor.java:1354)
  at org.apache.phoenix.shaded.org.apache.calcite.avatica.AvaticaResultSet.getString(AvaticaResultSet.java:257)
  at sqlline.Rows$Row.<init>(Rows.java:183)
  at sqlline.BufferedRows.<init>(BufferedRows.java:38)
  at sqlline.SqlLine.print(SqlLine.java:1660)
  at sqlline.Commands.execute(Commands.java:833)
  at sqlline.Commands.sql(Commands.java:732)
  at sqlline.SqlLine.dispatch(SqlLine.java:813)
  at sqlline.SqlLine.begin(SqlLine.java:686)
  at sqlline.SqlLine.start(SqlLine.java:398)
  at sqlline.SqlLine.main(SqlLine.java:291)
  at org.apache.phoenix.queryserver.client.SqllineWrapper.main(SqllineWrapper.java:93)



I guess there is some issue in the query server, but I can't figure out why.

Any suggestions?



Thanks,

Wei



phoenix query server java.lang.ClassCastException for BIGINT ARRAY column

2018-04-18 Thread Lu Wei
Hi there,

I have a Phoenix table containing a BIGINT ARRAY column, but when querying through the query server (via sqlline-thin.py), there is an exception:

java.lang.ClassCastException: java.lang.Integer cannot be cast to java.lang.Long

BTW, when querying through sqlline.py everything works fine, and the data in the HBase table are of Long type, so why does the Integer-to-Long cast happen?



## Table schema:

create table if not exists gis_tracking3(tracking_object_id bigint not null, 
lat double, lon double, speed double, bearing double, time timestamp not null, 
events bigint array constraint pk primary key (tracking_object_id, time))


## When querying events[1], it works fine:

0: jdbc:phoenix:thin:url=http://10.10.13.87:8> select events[1]+1 from gis_tracking3;
+------------------------------+
| (ARRAY_ELEM(EVENTS, 1) + 1)  |
+------------------------------+
| 11                           |
| 2223                         |
| null                         |
| null                         |
| 10001                        |
+------------------------------+



## When querying events, it throws an exception:

0: jdbc:phoenix:thin:url=http://10.10.13.87:8> select events from gis_tracking3;
java.lang.ClassCastException: java.lang.Integer cannot be cast to java.lang.Long
  at org.apache.phoenix.shaded.org.apache.calcite.avatica.util.AbstractCursor$LongAccessor.getLong(AbstractCursor.java:550)
  at org.apache.phoenix.shaded.org.apache.calcite.avatica.util.AbstractCursor$ArrayAccessor.convertValue(AbstractCursor.java:1310)
  at org.apache.phoenix.shaded.org.apache.calcite.avatica.util.AbstractCursor$ArrayAccessor.getObject(AbstractCursor.java:1289)
  at org.apache.phoenix.shaded.org.apache.calcite.avatica.util.AbstractCursor$ArrayAccessor.getArray(AbstractCursor.java:1342)
  at org.apache.phoenix.shaded.org.apache.calcite.avatica.util.AbstractCursor$ArrayAccessor.getString(AbstractCursor.java:1354)
  at org.apache.phoenix.shaded.org.apache.calcite.avatica.AvaticaResultSet.getString(AvaticaResultSet.java:257)
  at sqlline.Rows$Row.<init>(Rows.java:183)
  at sqlline.BufferedRows.<init>(BufferedRows.java:38)
  at sqlline.SqlLine.print(SqlLine.java:1660)
  at sqlline.Commands.execute(Commands.java:833)
  at sqlline.Commands.sql(Commands.java:732)
  at sqlline.SqlLine.dispatch(SqlLine.java:813)
  at sqlline.SqlLine.begin(SqlLine.java:686)
  at sqlline.SqlLine.start(SqlLine.java:398)
  at sqlline.SqlLine.main(SqlLine.java:291)
  at org.apache.phoenix.queryserver.client.SqllineWrapper.main(SqllineWrapper.java:93)



I guess there is some issue in the query server, but I can't figure out why.

Any suggestions?



Thanks,

Wei


Exception of phoenix hive integration

2018-04-14 Thread Lu Wei
## Version:
phoenix: 4.13.2-cdh5.11.2
hive: 1.1.0-cdh5.11.2


There is a ColumnNotFoundException when joining a Hive internal table with a Phoenix external table.



## Table1: phoenix external table "ext_tmp":

+-----------+------------+--------------------+
| col_name  | data_type  | comment            |
+-----------+------------+--------------------+
| cola      | string     | from deserializer  |
| colb      | string     | from deserializer  |
+-----------+------------+--------------------+


### Backend Phoenix table "TMP":

select * from TMP;

+-------+--------+
| cola  | colb   |
+-------+--------+
| a     |        |
| b     | bb     |
| ccc   | cccbb  |
+-------+--------+


### Hive external table creation statement:


+-----------------------------------------------------+
|                   createtab_stmt                    |
+-----------------------------------------------------+
| CREATE EXTERNAL TABLE `ext_tmp`(                    |
|   `cola` string COMMENT 'from deserializer',        |
|   `colb` string COMMENT 'from deserializer')        |
| ROW FORMAT SERDE                                    |
|   'org.apache.phoenix.hive.PhoenixSerDe'            |
| STORED BY                                           |
|   'org.apache.phoenix.hive.PhoenixStorageHandler'   |
| WITH SERDEPROPERTIES (                              |
|   'serialization.format'='1')                       |
| LOCATION                                            |
|   'hdfs://st:8020/data/user/hive/warehouse/ext_tmp' |
| TBLPROPERTIES (                                     |
|   'phoenix.column.mapping'='cola:cola,colb:colb',   |
|   'phoenix.rowkeys'='cola',                         |
|   'phoenix.table.name'='tmp',                       |
|   'phoenix.zookeeper.client.port'='2181',           |
|   'phoenix.zookeeper.quorum'='st1,st2,st3',         |
|   'phoenix.zookeeper.znode.parent'='/hbase',        |
|   'transient_lastDdlTime'='1523607352')             |
+-----------------------------------------------------+



## Table2: hive internal table "native1":

+-----------+------------+----------+
| col_name  | data_type  | comment  |
+-----------+------------+----------+
| cola      | string     |          |
| colb      | string     |          |
+-----------+------------+----------+



## When joining the two tables:

select * from native1 join ext_tmp t on native1.cola = t.cola;



Exception:

org.apache.phoenix.schema.ColumnNotFoundException: ERROR 504 (42703): Undefined column. columnName=TMP


The detailed exception is below. There is an empty column "" among the read-column names, which does not exist in the Phoenix table at all, so the generated Phoenix query can never be correct: select /*+ NO_CACHE  */ "","cola","colb" from tmp where "cola" is not null


Any thoughts?



-
2018-04-14 21:13:40,923 INFO  org.apache.hadoop.hive.ql.io.HiveInputFormat: [HiveServer2-Background-Pool: Thread-304]: hive.io.file.readcolumn.ids=
2018-04-14 21:13:40,923 INFO  org.apache.hadoop.hive.ql.io.HiveInputFormat: [HiveServer2-Background-Pool: Thread-304]: hive.io.file.readcolumn.names=,cola,colb
2018-04-14 21:13:40,923 INFO  org.apache.hadoop.hive.ql.io.HiveInputFormat: [HiveServer2-Background-Pool: Thread-304]: Generating splits
2018-04-14 21:13:40,924 INFO  org.apache.phoenix.hive.query.PhoenixQueryBuilder: [HiveServer2-Background-Pool: Thread-304]: Input query : select /*+ NO_CACHE  */ "","cola","colb" from tmp where "cola" is not null
2018-04-14 21:13:40,932 ERROR org.apache.phoenix.hive.mapreduce.PhoenixInputFormat: [HiveServer2-Background-Pool: Thread-304]: Failed to get the query plan with error [ERROR 504 (42703): Undefined column. columnName=TMP]


Thanks,

Wei