[jira] [Updated] (SPARK-24338) Spark SQL fails to create a table in Hive when running in an Apache Sentry-secured Environment

2018-05-21 Thread Chaoran Yu (JIRA)

 [ 
https://issues.apache.org/jira/browse/SPARK-24338?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Chaoran Yu updated SPARK-24338:
---
Description: 
This 
[commit|https://github.com/apache/spark/commit/ce13c2672318242748f7520ed4ce6bcfad4fb428]
 introduced a bug that causes the Spark SQL "CREATE TABLE" statement to fail in 
Hive when Apache Sentry is used to control cluster authorization. This bug 
exists in Spark 2.1.0 and all later releases. The error message thrown is in 
the attached file [^exception.txt].
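
For reference, a minimal sketch of a job that exercises this code path (the session setup, database, and table names below are hypothetical; the attached stack trace shows the real failure going through SQLContext.sql into CreateTableCommand and HiveExternalCatalog.doCreateTable):

{code:scala}
import org.apache.spark.sql.SparkSession

object CreateTableRepro {
  def main(args: Array[String]): Unit = {
    // Hive support is required so that CREATE TABLE is routed through
    // HiveExternalCatalog and the Hive metastore, where Sentry enforces
    // authorization.
    val spark = SparkSession.builder()
      .appName("SentryCreateTableRepro")
      .enableHiveSupport()
      .getOrCreate()

    // On a Sentry-secured cluster this statement fails with
    // "MetaException(message:User ... does not have privileges for CREATETABLE)".
    spark.sql("CREATE TABLE test_db.test_table (id INT, name STRING)")

    spark.stop()
  }
}
{code}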

Cloudera fixed this bug in their fork of Spark, as shown 
[here|https://github.com/cloudera/spark/blob/spark2-2.2.0-cloudera2/sql/hive/src/main/scala/org/apache/spark/sql/hive/HiveExternalCatalog.scala#L229].
 It would make sense for this fix to be merged back upstream.
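
Background for anyone triaging this (editor's note, not part of the original report, and best treated as an assumption): the failing call sits in HiveExternalCatalog.doCreateTable, which is also where Spark decides what table location to pass to the metastore, and the linked Cloudera change is in that same file. Sentry checks a URI privilege whenever a CREATE TABLE request reaches the metastore with an explicit table location, while a statement without a LOCATION clause is authorized against database-level privileges only. A small illustration from the SQL side, reusing the hypothetical `spark` session from the sketch above:

{code:scala}
// Hypothetical database, table, and path names; `spark` is the Hive-enabled
// SparkSession from the previous sketch.

// Authorized against database-level Sentry privileges only; the metastore
// picks the table location under the warehouse directory.
spark.sql("CREATE TABLE test_db.t_default (id INT, name STRING)")

// Additionally checked against a Sentry URI privilege on the given path,
// because the location is stated explicitly in the request to the metastore.
spark.sql(
  """CREATE TABLE test_db.t_explicit (id INT, name STRING)
    |LOCATION 'hdfs:///user/hive/warehouse/test_db.db/t_explicit'""".stripMargin)
{code}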

  was:
This 
[commit|https://github.com/apache/spark/commit/ce13c2672318242748f7520ed4ce6bcfad4fb428]
 introduced a bug that caused the Spark SQL "CREATE TABLE" statement to fail in 
Hive when Apache Sentry is used to control cluster authorization. This bug 
exists in Spark 2.1.0 and all later releases. The error message thrown is the 
following:

org.apache.spark.sql.AnalysisException: org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:User soadusr does not have privileges for CREATETABLE);
 at org.apache.spark.sql.hive.HiveExternalCatalog.withClient(HiveExternalCatalog.scala:106)
 at org.apache.spark.sql.hive.HiveExternalCatalog.doCreateTable(HiveExternalCatalog.scala:215)
 at org.apache.spark.sql.catalyst.catalog.ExternalCatalog.createTable(ExternalCatalog.scala:110)
 at org.apache.spark.sql.catalyst.catalog.SessionCatalog.createTable(SessionCatalog.scala:316)
 at org.apache.spark.sql.execution.command.CreateTableCommand.run(tables.scala:127)
 at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult$lzycompute(commands.scala:58)
 at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult(commands.scala:56)
 at org.apache.spark.sql.execution.command.ExecutedCommandExec.executeCollect(commands.scala:67)
 at org.apache.spark.sql.Dataset.<init>(Dataset.scala:182)
 at org.apache.spark.sql.Dataset$.ofRows(Dataset.scala:67)
 at org.apache.spark.sql.SparkSession.sql(SparkSession.scala:623)
 at org.apache.spark.sql.SQLContext.sql(SQLContext.scala:691)
 at test.HiveTestJob$.runJob(HiveTestJob.scala:86)
 at test.HiveTestJob$.runJob(HiveTestJob.scala:73)
 at spark.jobserver.JobManagerActor$$anonfun$spark$jobserver$JobManagerActor$$getJobFuture$8.apply(JobManagerActor.scala:407)
 at scala.concurrent.impl.Future$PromiseCompletingRunnable.liftedTree1$1(Future.scala:24)
 at scala.concurrent.impl.Future$PromiseCompletingRunnable.run(Future.scala:24)
 at monitoring.MdcPropagatingExecutionContext$$anon$1.run(MdcPropagatingExecutionContext.scala:24)
 at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
 at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
 at java.lang.Thread.run(Thread.java:748)
Caused by: org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:User soadusr does not have privileges for CREATETABLE)
 at org.apache.hadoop.hive.ql.metadata.Hive.createTable(Hive.java:720)
 at org.apache.spark.sql.hive.client.HiveClientImpl$$anonfun$createTable$1.apply$mcV$sp(HiveClientImpl.scala:446)
 at org.apache.spark.sql.hive.client.HiveClientImpl$$anonfun$createTable$1.apply(HiveClientImpl.scala:446)
 at org.apache.spark.sql.hive.client.HiveClientImpl$$anonfun$createTable$1.apply(HiveClientImpl.scala:446)
 at org.apache.spark.sql.hive.client.HiveClientImpl$$anonfun$withHiveState$1.apply(HiveClientImpl.scala:290)
 at org.apache.spark.sql.hive.client.HiveClientImpl.liftedTree1$1(HiveClientImpl.scala:231)
 at org.apache.spark.sql.hive.client.HiveClientImpl.retryLocked(HiveClientImpl.scala:230)
 at org.apache.spark.sql.hive.client.HiveClientImpl.withHiveState(HiveClientImpl.scala:273)
 at org.apache.spark.sql.hive.client.HiveClientImpl.createTable(HiveClientImpl.scala:445)
 at org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$doCreateTable$1.apply$mcV$sp(HiveExternalCatalog.scala:256)
 at org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$doCreateTable$1.apply(HiveExternalCatalog.scala:215)
 at org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$doCreateTable$1.apply(HiveExternalCatalog.scala:215)
 at org.apache.spark.sql.hive.HiveExternalCatalog.withClient(HiveExternalCatalog.scala:97)
 ... 20 more
Caused by: MetaException(message:User soadusr does not have privileges for CREATETABLE)
 at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$create_table_with_environment_context_result$create_table_with_environment_context_resultStandardScheme.read(ThriftHiveMetastore.java:29983)

[jira] [Updated] (SPARK-24338) Spark SQL fails to create a table in Hive when running in an Apache Sentry-secured Environment

2018-05-21 Thread Chaoran Yu (JIRA)

 [ 
https://issues.apache.org/jira/browse/SPARK-24338?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Chaoran Yu updated SPARK-24338:
---
Attachment: exception.txt

> Spark SQL fails to create a table in Hive when running in an Apache Sentry-secured Environment
> -
>
> Key: SPARK-24338
> URL: https://issues.apache.org/jira/browse/SPARK-24338
> Project: Spark
> Issue Type: Bug
> Components: SQL
> Affects Versions: 2.3.0
> Reporter: Chaoran Yu
> Priority: Critical
> Attachments: exception.txt