Hi,

 

I am working with Sentry + Hive Server. I want to revoke the INSERT
permission of user jim, who should only be able to SELECT from a table. I
implemented my own class that extends
org.apache.sentry.binding.hive.v2.metastore.MetastoreAuthzBindingV2. My
understanding is that I should handle the Hive metastore events READ_TABLE
and ALTER_TABLE in its onEvent() method, so my binding throws a
MetaException/NoSuchObjectException/InvalidOperationException (I tried all
of them) for ALTER_TABLE. However, when I check the table afterwards, the
data has actually been inserted.
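
Roughly, my binding class looks like the following. This is a simplified
sketch rather than my exact code: the real authorization lookup (the
DriverAuth/JMX call visible in the log below) is reduced to a placeholder,
and I am assuming the (Configuration) constructor and the
onEvent(PreEventContext) signature that MetastoreAuthzBindingV2 inherits
from Hive's MetaStorePreEventListener.

package com.vitria.spark.auth;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hive.metastore.api.InvalidOperationException;
import org.apache.hadoop.hive.metastore.api.MetaException;
import org.apache.hadoop.hive.metastore.api.NoSuchObjectException;
import org.apache.hadoop.hive.metastore.events.PreEventContext;
import org.apache.sentry.binding.hive.v2.metastore.MetastoreAuthzBindingV2;

// Simplified sketch of the custom binding: reject READ_TABLE/ALTER_TABLE
// metastore events for users that fail our own authorization check.
public class VtMetastoreAuthzBinding extends MetastoreAuthzBindingV2 {

  public VtMetastoreAuthzBinding(Configuration config) throws Exception {
    super(config);
  }

  @Override
  public void onEvent(PreEventContext context)
      throws MetaException, NoSuchObjectException, InvalidOperationException {
    switch (context.getEventType()) {
      case READ_TABLE:
      case ALTER_TABLE:
        // Placeholder for the external check (DriverAuth::authorize over JMX
        // in my setup); for user jim and ALTER_TABLE this returns false.
        if (!isAuthorized(context)) {
          throw new MetaException("User's group is not authorized.");
        }
        break;
      default:
        break;
    }
    // Let Sentry's own binding handle everything else.
    super.onEvent(context);
  }

  private boolean isAuthorized(PreEventContext context) {
    // The real implementation calls out to our authorization service.
    return false;
  }
}

The class is registered as a metastore pre-event listener, so onEvent() is
called before each metastore operation (READ_DATABASE, READ_TABLE,
ALTER_TABLE, ... as seen in the log below).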

 

The client does receive this exception, so why is the data still inserted?
Is something wrong in my program, or is this a bug?

 

Thank you!

 

 

Qin An.

------------------------

Here is the message on the client side:

 

INFO: Will try to open client transport with JDBC Uri: jdbc:hive2://localhost:10006/default

Invoking server with user jim and SQL: insert into test_tuk values(1004, 'Tom', 'Lee', 35)

Before executeQuery..

e is java.sql.SQLException: org.apache.spark.sql.AnalysisException: org.apache.hadoop.hive.ql.metadata.HiveException: Unable to alter table. User jim's group is not authorized.;

java.sql.SQLException: org.apache.spark.sql.AnalysisException: org.apache.hadoop.hive.ql.metadata.HiveException: Unable to alter table. User jim's group is not authorized.;
        at org.apache.hive.jdbc.HiveStatement.execute(HiveStatement.java:296)
        at org.apache.hive.jdbc.HiveStatement.executeQuery(HiveStatement.java:392)
        at com.vitria.spark2.TestAuthClient.invoke(TestAuthClient.java:111)
        at com.vitria.spark2.TestAuthClient.run(TestAuthClient.java:83)
        at java.lang.Thread.run(Unknown Source)

 

Here is the messages from Spark log:

 

2017-12-06 15:30:36,713 INFO  [com.vitria.spark.StdOutErrLog]
(HiveServer2-Handler-Pool: Thread-633;) ####
VtAuthenticationImpl:Authenticate user jim  password 123456

2017-12-06 15:30:36,792 INFO  [org.xnio] (HiveServer2-Handler-Pool:
Thread-633;) XNIO version 3.3.1.Final

2017-12-06 15:30:36,801 INFO  [org.xnio.nio] (HiveServer2-Handler-Pool:
Thread-633;) XNIO NIO Implementation Version 3.3.1.Final

2017-12-06 15:30:36,835 INFO  [org.jboss.remoting]
(HiveServer2-Handler-Pool: Thread-633;) JBoss Remoting version
4.0.9.Final

2017-12-06 15:30:37,003 INFO  [org.jboss.ejb.client.remoting] (Remoting
"config-based-naming-client-endpoint" task-5;) EJBCLIENT000017: Received
server version 2 and marshalling strategies [river]

2017-12-06 15:30:37,013 INFO  [org.jboss.ejb.client.remoting] (HiveServer2-Handler-Pool: Thread-633;) EJBCLIENT000013: Successful version handshake completed for receiver context EJBReceiverContext{clientContext=org.jboss.ejb.client.EJBClientContext@547351f8, receiver=Remoting connection EJB receiver [connection=Remoting connection <176d2451>,channel=jboss.ejb,nodename=pek-wkst68446]} on channel Channel ID 99975e51 (outbound) of Remoting connection 0d681767 to /10.101.5.54:8080

2017-12-06 15:30:37,287 INFO  [org.jboss.ejb.client]
(HiveServer2-Handler-Pool: Thread-633;) JBoss EJB Client version
2.1.1.Final

2017-12-06 15:30:37,394 INFO  [org.jboss.ejb.client.remoting] (Remoting
"config-based-naming-client-endpoint" task-9;) EJBCLIENT000016: Channel
Channel ID 99975e51 (outbound) of Remoting connection 0d681767 to
/10.101.5.54:8080 can no longer process messages

2017-12-06 15:30:37,791 INFO
[org.apache.spark.sql.execution.SparkSqlParser]
(HiveServer2-Handler-Pool: Thread-633;) Parsing command: use default

2017-12-06 15:30:37,866 INFO  [DataNucleus.Query]
(HiveServer2-Handler-Pool: Thread-633;) Reading in results for query
"org.datanucleus.store.rdbms.query.SQLQuery@0" since the connection used
is closing

2017-12-06 15:30:37,869 INFO
[com.vitria.spark.auth.VtMetastoreAuthzBinding]
(HiveServer2-Handler-Pool: Thread-633;) ######
VtMetastoreAuthzBinding:onEvent: Thread Thread[HiveServer2-Handler-Pool:
Thread-633,5,main]  userName = aqin  event type = READ_DATABASE

2017-12-06 15:30:37,869 INFO
[com.vitria.spark.auth.VtMetastoreAuthzBinding]
(HiveServer2-Handler-Pool: Thread-633;) ###### authorize local user:
aqin

2017-12-06 15:30:37,943 INFO
[org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation]
(pool-24-thread-1;) Running query 'insert into test_tuk values(1008,
'Tom', 'Lee', 35)' with 48ca7164-5c4a-4a78-af23-9f44c2b910d7

2017-12-06 15:30:37,944 INFO
[org.apache.spark.sql.execution.SparkSqlParser] (pool-24-thread-1;)
Parsing command: insert into test_tuk values(1008, 'Tom', 'Lee', 35)

2017-12-06 15:30:38,106 INFO  [DataNucleus.Query] (pool-24-thread-1;)
Reading in results for query
"org.datanucleus.store.rdbms.query.SQLQuery@0" since the connection used
is closing

2017-12-06 15:30:38,242 INFO
[com.vitria.spark.auth.VtMetastoreAuthzBinding] (pool-24-thread-1;)
###### VtMetastoreAuthzBinding:onEvent: Thread
Thread[pool-24-thread-1,5,main]  userName = jim  event type = READ_TABLE

2017-12-06 15:30:38,243 INFO  [com.vitria.spark.auth.DriverAuth]
(pool-24-thread-1;) #### DriverAuth::authorize: jmxUrl_ =
service:jmx:rmi://0.0.0.0:53741/jndi/rmi://PEK-WKST68446:53742/jmxrmi
username = jim

2017-12-06 15:30:40,609 INFO
[org.apache.spark.sql.catalyst.parser.CatalystSqlParser]
(pool-24-thread-1;) Parsing command: int

2017-12-06 15:30:40,621 INFO
[org.apache.spark.sql.catalyst.parser.CatalystSqlParser]
(pool-24-thread-1;) Parsing command: varchar(255)

2017-12-06 15:30:40,622 INFO
[org.apache.spark.sql.catalyst.parser.CatalystSqlParser]
(pool-24-thread-1;) Parsing command: varchar(255)

2017-12-06 15:30:40,622 INFO
[org.apache.spark.sql.catalyst.parser.CatalystSqlParser]
(pool-24-thread-1;) Parsing command: int

2017-12-06 15:30:40,981 INFO
[org.apache.spark.sql.catalyst.expressions.codegen.CodeGenerator]
(pool-24-thread-1;) Code generated in 7.467433 ms

2017-12-06 15:30:41,881 INFO  [org.apache.spark.SparkContext]
(pool-24-thread-1;) Starting job: run at <unknown>:0

2017-12-06 15:30:41,899 INFO  [org.apache.spark.SparkContext]
(dag-scheduler-event-loop;) Created broadcast 2 from broadcast at
DAGScheduler.scala:996

2017-12-06 15:30:41,902 INFO  [org.apache.spark.executor.Executor]
(Executor task launch worker for task 2;) Running task 0.0 in stage 2.0
(TID 2)

2017-12-06 15:30:42,422 INFO
[org.apache.spark.mapred.SparkHadoopMapRedUtil] (Executor task launch
worker for task 2;) attempt_20171206153041_0002_m_000000_0: Committed

2017-12-06 15:30:42,425 INFO  [org.apache.spark.executor.Executor]
(Executor task launch worker for task 2;) Finished task 0.0 in stage 2.0
(TID 2). 1224 bytes result sent to driver

2017-12-06 15:30:42,548 INFO
[com.vitria.spark.auth.VtMetastoreAuthzBinding] (pool-24-thread-1;)
###### VtMetastoreAuthzBinding:onEvent: Thread
Thread[pool-24-thread-1,5,main]  userName = jim  event type = READ_TABLE

2017-12-06 15:30:42,548 INFO  [com.vitria.spark.auth.DriverAuth]
(pool-24-thread-1;) #### DriverAuth::authorize: jmxUrl_ =
service:jmx:rmi://0.0.0.0:53741/jndi/rmi://PEK-WKST68446:53742/jmxrmi
username = jim

2017-12-06 15:30:42,623 INFO
[com.vitria.spark.auth.VtMetastoreAuthzBinding] (pool-24-thread-1;)
###### VtMetastoreAuthzBinding:onEvent: Thread
Thread[pool-24-thread-1,5,main]  userName = jim  event type = READ_TABLE

2017-12-06 15:30:42,623 INFO  [com.vitria.spark.auth.DriverAuth]
(pool-24-thread-1;) #### DriverAuth::authorize: jmxUrl_ =
service:jmx:rmi://0.0.0.0:53741/jndi/rmi://PEK-WKST68446:53742/jmxrmi
username = jim

2017-12-06 15:30:42,658 INFO  [hive.ql.metadata.Hive] (pool-24-thread-1;) Renaming src: hdfs://10.101.3.128:9090/user/anqin/test_tuk/.hive-staging_hive_2017-12-06_15-30-40_998_8671993461936031021-1/-ext-10000/part-00000, dest: hdfs://10.101.3.128:9090/user/anqin/test_tuk/part-00000_copy_8, Status:true

2017-12-06 15:30:42,675 ERROR [com.vitria.spark.StdOutErrLog]
(pool-24-thread-1;) chmod: changing permissions of
'hdfs://10.101.3.128:9090/user/anqin/test_tuk/part-00000_copy_8':
Permission denied. user=jim is not the owner of inode=part-00000_copy_8

2017-12-06 15:30:42,758 INFO
[com.vitria.spark.auth.VtMetastoreAuthzBinding] (pool-24-thread-1;)
###### VtMetastoreAuthzBinding:onEvent: Thread
Thread[pool-24-thread-1,5,main]  userName = jim  event type =
ALTER_TABLE

2017-12-06 15:30:42,759 INFO  [com.vitria.spark.auth.DriverAuth]
(pool-24-thread-1;) #### DriverAuth::authorize: jmxUrl_ =
service:jmx:rmi://0.0.0.0:53741/jndi/rmi://PEK-WKST68446:53742/jmxrmi
username = jim

2017-12-06 15:30:42,796 ERROR
[com.vitria.spark.auth.VtMetastoreAuthzBinding] (pool-24-thread-1;)
Failed to authorize user jim to table test_tuk

2017-12-06 15:30:42,895 ERROR
[org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation]
(pool-24-thread-1;) Error executing query, currentState RUNNING, 

org.apache.spark.sql.AnalysisException: org.apache.hadoop.hive.ql.metadata.HiveException: Unable to alter table. User jim's group is not authorized. InvalidOperationException;
   at org.apache.spark.sql.hive.HiveExternalCatalog.withClient(HiveExternalCatalog.scala:106)
   at org.apache.spark.sql.hive.HiveExternalCatalog.loadTable(HiveExternalCatalog.scala:766)
   at org.apache.spark.sql.hive.execution.InsertIntoHiveTable.sideEffectResult$lzycompute(InsertIntoHiveTable.scala:374)
   at org.apache.spark.sql.hive.execution.InsertIntoHiveTable.sideEffectResult(InsertIntoHiveTable.scala:221)
   at org.apache.spark.sql.hive.execution.InsertIntoHiveTable.doExecute(InsertIntoHiveTable.scala:407)
   at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:114)
   at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:114)
   at org.apache.spark.sql.execution.SparkPlan$$anonfun$executeQuery$1.apply(SparkPlan.scala:135)
   at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
   at org.apache.spark.sql.execution.SparkPlan.executeQuery(SparkPlan.scala:132)
   at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:113)
   at org.apache.spark.sql.execution.QueryExecution.toRdd$lzycompute(QueryExecution.scala:92)
   at org.apache.spark.sql.execution.QueryExecution.toRdd(QueryExecution.scala:92)
   at org.apache.spark.sql.Dataset.<init>(Dataset.scala:185)
   at org.apache.spark.sql.Dataset$.ofRows(Dataset.scala:64)
   at org.apache.spark.sql.SparkSession.sql(SparkSession.scala:592)
   at org.apache.spark.sql.SQLContext.sql(SQLContext.scala:699)
   at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation.org$apache$spark$sql$hive$thriftserver$SparkExecuteStatementOperation$$execute(SparkExecuteStatementOperation.scala:231)
   at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation$$anon$1$$anon$2.run(SparkExecuteStatementOperation.scala:174)
   at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation$$anon$1$$anon$2.run(SparkExecuteStatementOperation.scala:171)
   at java.security.AccessController.doPrivileged(Native Method)
   at javax.security.auth.Subject.doAs(Unknown Source)
   at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1698)
   at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation$$anon$1.run(SparkExecuteStatementOperation.scala:184)
   at java.util.concurrent.Executors$RunnableAdapter.call(Unknown Source)
   at java.util.concurrent.FutureTask.run(Unknown Source)
   at java.util.concurrent.ThreadPoolExecutor.runWorker(Unknown Source)
   at java.util.concurrent.ThreadPoolExecutor$Worker.run(Unknown Source)
   at java.lang.Thread.run(Unknown Source)
Caused by: org.apache.hadoop.hive.ql.metadata.HiveException: Unable to alter table. User jim's group is not authorized. InvalidOperationException
   at org.apache.hadoop.hive.ql.metadata.Hive.alterTable(Hive.java:498)
   at org.apache.hadoop.hive.ql.metadata.Hive.alterTable(Hive.java:484)
   at org.apache.hadoop.hive.ql.metadata.Hive.loadTable(Hive.java:1668)
   at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
   at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
   at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
   at java.lang.reflect.Method.invoke(Unknown Source)
   at org.apache.spark.sql.hive.client.Shim_v0_14.loadTable(HiveShim.scala:728)
   at org.apache.spark.sql.hive.client.HiveClientImpl$$anonfun$loadTable$1.apply$mcV$sp(HiveClientImpl.scala:676)
   at org.apache.spark.sql.hive.client.HiveClientImpl$$anonfun$loadTable$1.apply(HiveClientImpl.scala:676)
   at org.apache.spark.sql.hive.client.HiveClientImpl$$anonfun$loadTable$1.apply(HiveClientImpl.scala:676)
   at org.apache.spark.sql.hive.client.HiveClientImpl$$anonfun$withHiveState$1.apply(HiveClientImpl.scala:279)
   at org.apache.spark.sql.hive.client.HiveClientImpl.liftedTree1$1(HiveClientImpl.scala:226)
   at org.apache.spark.sql.hive.client.HiveClientImpl.retryLocked(HiveClientImpl.scala:225)
   at org.apache.spark.sql.hive.client.HiveClientImpl.withHiveState(HiveClientImpl.scala:268)
   at org.apache.spark.sql.hive.client.HiveClientImpl.loadTable(HiveClientImpl.scala:675)
   at org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$loadTable$1.apply$mcV$sp(HiveExternalCatalog.scala:768)
   at org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$loadTable$1.apply(HiveExternalCatalog.scala:766)
   at org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$loadTable$1.apply(HiveExternalCatalog.scala:766)
   at org.apache.spark.sql.hive.HiveExternalCatalog.withClient(HiveExternalCatalog.scala:97)
   ... 28 more
Caused by: MetaException(message:User jim's group is not authorized. InvalidOperationException)
   at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.firePreEvent(HiveMetaStore.java:1996)
   at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.alter_table_core(HiveMetaStore.java:3407)
   at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.alter_table_with_cascade(HiveMetaStore.java:3380)
   at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
   at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
   at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
   at java.lang.reflect.Method.invoke(Unknown Source)
   at org.apache.hadoop.hive.metastore.RetryingHMSHandler.invoke(RetryingHMSHandler.java:107)
   at com.sun.proxy.$Proxy12.alter_table_with_cascade(Unknown Source)
   at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.alter_table(HiveMetaStoreClient.java:340)
   at org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient.alter_table(SessionHiveMetaStoreClient.java:251)
   at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
   at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
   at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
   at java.lang.reflect.Method.invoke(Unknown Source)
   at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.invoke(RetryingMetaStoreClient.java:156)
   at com.sun.proxy.$Proxy13.alter_table(Unknown Source)
   at org.apache.hadoop.hive.ql.metadata.Hive.alterTable(Hive.java:496)
   ... 47 more

2017-12-06 15:30:42,902 ERROR
[org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation]
(pool-24-thread-1;) Error running hive query: 

org.apache.hive.service.cli.HiveSQLException: org.apache.spark.sql.AnalysisException: org.apache.hadoop.hive.ql.metadata.HiveException: Unable to alter table. User jim's group is not authorized. InvalidOperationException;
   at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation.org$apache$spark$sql$hive$thriftserver$SparkExecuteStatementOperation$$execute(SparkExecuteStatementOperation.scala:266)
   at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation$$anon$1$$anon$2.run(SparkExecuteStatementOperation.scala:174)
   at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation$$anon$1$$anon$2.run(SparkExecuteStatementOperation.scala:171)
   at java.security.AccessController.doPrivileged(Native Method)
   at javax.security.auth.Subject.doAs(Unknown Source)
   at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1698)
   at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation$$anon$1.run(SparkExecuteStatementOperation.scala:184)
   at java.util.concurrent.Executors$RunnableAdapter.call(Unknown Source)
   at java.util.concurrent.FutureTask.run(Unknown Source)
   at java.util.concurrent.ThreadPoolExecutor.runWorker(Unknown Source)
   at java.util.concurrent.ThreadPoolExecutor$Worker.run(Unknown Source)
   at java.lang.Thread.run(Unknown Source)
