Premchandra Preetham Kukillaya created SPARK-8659:
-----------------------------------------------------

             Summary: SQL Standard Based Hive Authorisation of Hive 0.13 does 
not work when pointing a JDBC application at the Spark Thrift Server. 
                 Key: SPARK-8659
                 URL: https://issues.apache.org/jira/browse/SPARK-8659
             Project: Spark
          Issue Type: Bug
          Components: SQL
    Affects Versions: 1.3.1
         Environment: Linux
            Reporter: Premchandra Preetham Kukillaya


When pointing a JDBC/ODBC driver at the Spark SQL Thrift Server, Hive's SQL 
Standard Based Authorization feature does not work, whereas the same 
authorization works when I point the JDBC driver at the ThriftCLIService 
provided by HiveServer2.

The problem is that user X can run SELECT on a table belonging to user Y.

I am using Hive 0.13.1 and Spark 1.3.1.
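
A minimal reproduction sketch, assuming Beeline as the JDBC client, the Thrift 
server started with the command below, and a table t_user_y owned by user Y 
(the host, user, and table names here are hypothetical placeholders):

```shell
# Connect to the Spark Thrift Server on port 10001 as user X
# (host, user, and table names are placeholders, not from the report).
# Expected, per SQL Standard Based Authorization: permission denied.
# Observed against the Spark Thrift Server: the query succeeds.
beeline -u "jdbc:hive2://xxxxhostname.compute.amazonaws.com:10001" \
        -n userX \
        -e "SELECT * FROM t_user_y;"
```

Running the same Beeline command against HiveServer2's own ThriftCLIService 
port rejects the query with an authorization error, which is the expected 
behavior.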


./start-thriftserver.sh \
  --hiveconf hive.server2.thrift.port=10001 \
  --hiveconf xxxxhostname.compute.amazonaws.com \
  --hiveconf hive.security.authenticator.manager=org.apache.hadoop.hive.ql.security.SessionStateUserAuthenticator \
  --hiveconf hive.security.authorization.manager=org.apache.hadoop.hive.ql.security.authorization.plugin.sqlstd.SQLStdHiveAuthorizerFactory \
  --hiveconf hive.server2.enable.doAs=false \
  --hiveconf hive.security.authorization.enabled=true \
  --hiveconf mapred.reduce.tasks=-1 \
  --hiveconf mapred.max.split.size=256000000 \
  --hiveconf hive.downloaded.resources.dir=/mnt/var/lib/hive/downloaded_resources \
  --hiveconf javax.jdo.option.ConnectionURL=jdbc:mysql://localhost:3306/hive?createDatabaseIfNotExist=true \
  --hiveconf javax.jdo.option.ConnectionDriverName=com.mysql.jdbc.Driver \
  --hiveconf javax.jdo.option.ConnectionUserName=hive \
  --hiveconf javax.jdo.option.ConnectionPassword=hive \
  --hiveconf hive.metastore.warehouse.dir=/user/hive/warehouse \
  --hiveconf hive.metastore.connect.retries=5 \
  --hiveconf datanucleus.fixedDatastore=true



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
