[ 
https://issues.apache.org/jira/browse/SPARK-19661?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15877441#comment-15877441
 ] 

sydt commented on SPARK-19661:
------------------------------

Yeah, you are right. It is actually accomplished by hive-hbase-handler-0.13.1.jar.
In Hive, we can create the table with:
SET hbase.zookeeper.quorum=zkNode1,zkNode2,zkNode3; 
SET zookeeper.znode.parent=/hbase;
ADD jar /usr/local/apache-hive-0.13.1-bin/lib/hive-hbase-handler-0.13.1.jar;
CREATE EXTERNAL TABLE lxw1234 (
rowkey string,
f1 map<STRING,STRING>,
f2 map<STRING,STRING>,
f3 map<STRING,STRING>
) STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler'
WITH SERDEPROPERTIES ("hbase.columns.mapping" = ":key,f1:,f2:,f3:")
TBLPROPERTIES ("hbase.table.name" = "lxw1234");
After running:
INSERT INTO TABLE lxw1234 
SELECT 'row1' AS rowkey,
map('c3','name3') AS f1,
map('c3','age3') AS f2,
map('c4','job3') AS f3 
FROM DUAL 
limit 1;
Hive can now access table lxw1234 like an ordinary external table, and the
data is stored in HBase.
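Since Spark 2.1's SQL parser rejects STORED BY (the error quoted below), one common workaround is to create the table through Hive as above and then read it from Spark via the shared Hive metastore. A minimal sketch, assuming Spark was built with Hive support, both use the same metastore, and the storage-handler jars are supplied on the classpath (jar names here are illustrative):

```scala
import org.apache.spark.sql.SparkSession

// Sketch: read the Hive-managed, HBase-backed table from Spark.
// Assumes the table lxw1234 already exists (created via Hive, as above),
// and that the HBase handler jars were passed to Spark, e.g.
//   --jars hive-hbase-handler-0.13.1.jar,<hbase client jars>
val spark = SparkSession.builder()
  .appName("read-hive-hbase-table")
  .enableHiveSupport() // requires a Hive-enabled Spark build
  .getOrCreate()

// Spark cannot parse the STORED BY DDL itself, but it can query the
// table once the metastore already knows about it.
spark.sql("SELECT rowkey, f1 FROM lxw1234").show()
```

This sidesteps the parser limitation rather than fixing it; the CREATE EXTERNAL TABLE ... STORED BY statement still has to be issued from Hive.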

> Spark-2.1.0 can not connect hbase
> ---------------------------------
>
>                 Key: SPARK-19661
>                 URL: https://issues.apache.org/jira/browse/SPARK-19661
>             Project: Spark
>          Issue Type: Bug
>          Components: Deploy
>    Affects Versions: 2.1.0
>            Reporter: sydt
>
> When spark-sql of spark-2.1.0 connect hbase by
> CREATE EXTERNAL TABLE lxw123(  
> rowkey string,  
> f1 map<STRING,STRING>,  
> f2 map<STRING,STRING>,  
> f3 map<STRING,STRING>  
> ) STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler'  
> WITH SERDEPROPERTIES ("hbase.columns.mapping" = ":key,f1:,f2:,f3:")  
> TBLPROPERTIES ("hbase.table.name" = "lxw1234");
> there is no response and it shows:
> Error in query: 
> Operation not allowed: STORED BY(line 6, pos 2)



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)
