zhangqw created SENTRY-1591:
-------------------------------

             Summary: Spark DataFrame saveAsTable not working properly after Sentry turned on
                 Key: SENTRY-1591
                 URL: https://issues.apache.org/jira/browse/SENTRY-1591
             Project: Sentry
          Issue Type: Bug
          Components: Sentry
    Affects Versions: 1.5.1
         Environment: CentOS 6.5 / Spark 2.0.0 / Hive 1.1.0
            Reporter: zhangqw


Execute the following in Spark as user test, who has the _ALL_ privilege on database _db_test_:
{code:title=scala|borderStyle=solid}
import org.apache.spark.sql._

// Read an existing table and write it back out as a new managed table.
val table = spark.table("db_test.table1")
table.write.mode(SaveMode.Overwrite).saveAsTable("db_test.table2")
{code}
Part of the log:
{noformat}
org.apache.spark.sql.AnalysisException: org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:User p2_g18 does not have privileges for CREATETABLE);
        at org.apache.spark.sql.hive.HiveExternalCatalog.withClient(HiveExternalCatalog.scala:75)
        at org.apache.spark.sql.hive.HiveExternalCatalog.createTable(HiveExternalCatalog.scala:152)
        at org.apache.spark.sql.catalyst.catalog.SessionCatalog.createTable(SessionCatalog.scala:226)
        at org.apache.spark.sql.execution.command.CreateDataSourceTableUtils$.createDataSourceTable(createDataSourceTables.scala:487)
        at org.apache.spark.sql.execution.command.CreateDataSourceTableAsSelectCommand.run(createDataSourceTables.scala:256)
        at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult$lzycompute(commands.scala:60)
        at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult(commands.scala:58)
        at org.apache.spark.sql.execution.command.ExecutedCommandExec.doExecute(commands.scala:74)
        at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:115)
        at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:115)
        at org.apache.spark.sql.execution.SparkPlan$$anonfun$executeQuery$1.apply(SparkPlan.scala:136)
        at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
        at org.apache.spark.sql.execution.SparkPlan.executeQuery(SparkPlan.scala:133)
        at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:114)
        at org.apache.spark.sql.execution.QueryExecution.toRdd$lzycompute(QueryExecution.scala:86)
        at org.apache.spark.sql.execution.QueryExecution.toRdd(QueryExecution.scala:86)
        at org.apache.spark.sql.DataFrameWriter.saveAsTable(DataFrameWriter.scala:378)
... ...
Caused by: org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:User p2_g18 does not have privileges for CREATETABLE)
        at org.apache.hadoop.hive.ql.metadata.Hive.createTable(Hive.java:720)
        at org.apache.spark.sql.hive.client.HiveClientImpl$$anonfun$createTable$1.apply$mcV$sp(HiveClientImpl.scala:403)
        at org.apache.spark.sql.hive.client.HiveClientImpl$$anonfun$createTable$1.apply(HiveClientImpl.scala:403)
        at org.apache.spark.sql.hive.client.HiveClientImpl$$anonfun$createTable$1.apply(HiveClientImpl.scala:403)
        at org.apache.spark.sql.hive.client.HiveClientImpl$$anonfun$withHiveState$1.apply(HiveClientImpl.scala:261)
        at org.apache.spark.sql.hive.client.HiveClientImpl.liftedTree1$1(HiveClientImpl.scala:208)
        at org.apache.spark.sql.hive.client.HiveClientImpl.retryLocked(HiveClientImpl.scala:207)
        at org.apache.spark.sql.hive.client.HiveClientImpl.withHiveState(HiveClientImpl.scala:250)
        at org.apache.spark.sql.hive.client.HiveClientImpl.createTable(HiveClientImpl.scala:402)
        at org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$createTable$1.apply$mcV$sp(HiveExternalCatalog.scala:188)
        at org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$createTable$1.apply(HiveExternalCatalog.scala:152)
        at org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$createTable$1.apply(HiveExternalCatalog.scala:152)
        at org.apache.spark.sql.hive.HiveExternalCatalog.withClient(HiveExternalCatalog.scala:72)
        ... 67 more
Caused by: MetaException(message:User p2_g18 does not have privileges for CREATETABLE)
        at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$create_table_with_environment_context_result$create_table_with_environment_context_resultStandardScheme.read(ThriftHiveMetastore.java:29983)
        at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$create_table_with_environment_context_result$create_table_with_environment_context_resultStandardScheme.read(ThriftHiveMetastore.java:29951)
        at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$create_table_with_environment_context_result.read(ThriftHiveMetastore.java:29877)
        at org.apache.thrift.TServiceClient.receiveBase(TServiceClient.java:78)
        at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.recv_create_table_with_environment_context(ThriftHiveMetastore.java:1075)
        at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.create_table_with_environment_context(ThriftHiveMetastore.java:1061)
        at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.create_table_with_environment_context(HiveMetaStoreClient.java:2050)
        at org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient.create_table_with_environment_context(SessionHiveMetaStoreClient.java:97)
        at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.createTable(HiveMetaStoreClient.java:669)
        at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.createTable(HiveMetaStoreClient.java:657)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.invoke(RetryingMetaStoreClient.java:156)
        at com.sun.proxy.$Proxy38.createTable(Unknown Source)
        at org.apache.hadoop.hive.ql.metadata.Hive.createTable(Hive.java:714)
        ... 79 more
{noformat}
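
The MetaException above names p2_g18 as the user the metastore rejected. If that is not the user the job was expected to run as, the snippet below (a diagnostic sketch, not part of the original reproduction) prints the Hadoop identity the Spark driver actually presents to the metastore:
{code:title=scala|borderStyle=solid}
// Diagnostic sketch: show the Hadoop identity of the current spark-shell session.
// This is normally the user the Hive metastore client authenticates as, so it
// should match the user named in the MetaException above.
import org.apache.hadoop.security.UserGroupInformation

val ugi = UserGroupInformation.getCurrentUser
println(s"short name: ${ugi.getShortUserName}, full name: ${ugi.getUserName}")
{code}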

And db_test.table2 can only be read by Spark. If I try to read it via Hive, an exception occurs:
{noformat}
Error: java.io.IOException: java.io.IOException: 
hdfs://namenodeha/apps/hive/warehouse/db_test.db/table2/part-r-00000-76fadf03-e477-42e3-acb2-22819dca534e.snappy.parquet not a SequenceFile (state=,code=0)
{noformat}
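
The "not a SequenceFile" error is consistent with how Spark 2.0 registers tables written by saveAsTable: the real provider and schema are kept in spark.sql.sources.* table properties, while the metastore storage descriptor carries a SequenceFile placeholder that Hive then takes literally. The snippet below (a diagnostic sketch, not part of the original report) shows how the table ended up registered:
{code:title=scala|borderStyle=solid}
// Diagnostic sketch: dump the metastore view of the table Spark created.
// For a Spark 2.0 data source table, the InputFormat line typically shows
// SequenceFileInputFormat while the real format (parquet) is hidden in the
// spark.sql.sources.* table properties.
spark.sql("DESCRIBE FORMATTED db_test.table2").show(100, truncate = false)
{code}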

But if I run the same code as a user who has the _ALL_ privilege on the Hive warehouse, everything is OK.



