ethan7811 opened a new issue #803:
URL: https://github.com/apache/submarine/issues/803


   We use Submarine Spark Security to enable Kyuubi ACLs, and we build it with `-Pspark3.1` as described in https://github.com/apache/submarine/issues/774. However, whenever we submit SQL statements we always hit the error below, similar to issue https://github.com/apache/submarine/issues/533.
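   For context, this is roughly how we wire the plugin into the Spark session (a minimal sketch; the extension class name is taken from the Submarine spark-security docs, and we assume `ranger-spark-security.xml` and `ranger-spark-audit.xml` are on the driver classpath):
   ```scala
   import org.apache.spark.sql.SparkSession

   // Minimal sketch: enable the Submarine Ranger extension so that every
   // query plan passes through SubmarineSparkRangerAuthorizationExtension,
   // the optimizer rule visible in the stack trace below.
   val spark = SparkSession.builder()
     .appName("kyuubi-ranger-acl")
     // Extension entry point per the Submarine spark-security docs; adjust
     // if your build exposes a different class.
     .config("spark.sql.extensions",
       "org.apache.submarine.spark.security.api.RangerSparkSQLExtension")
     .enableHiveSupport()
     .getOrCreate()

   // Any statement now goes through RangerSparkAuthorizer.checkPrivileges,
   // which is where the USE check on spark_catalog fails for us.
   spark.sql("SELECT 1").show()
   ```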
   Our Spark version is 3.1.2 and our Ranger version is 2.1. We tried adding a Ranger policy for the `spark_catalog` resource (sketched after the trace below), but it has no effect and the statement is still denied:
   ```
   21/11/18 18:22:49 WARN service.ThriftFrontendService: Error executing statement:
   org.apache.kyuubi.KyuubiSQLException: Error operating EXECUTE_STATEMENT: org.apache.submarine.spark.security.SparkAccessControlException: Permission denied: user [schedule] does not have [USE] privilege on [spark_catalog]
       at org.apache.submarine.spark.security.RangerSparkAuthorizer$.$anonfun$checkPrivileges$3(RangerSparkAuthorizer.scala:126)
       at org.apache.submarine.spark.security.RangerSparkAuthorizer$.$anonfun$checkPrivileges$3$adapted(RangerSparkAuthorizer.scala:100)
       at scala.collection.mutable.ResizableArray.foreach(ResizableArray.scala:62)
       at scala.collection.mutable.ResizableArray.foreach$(ResizableArray.scala:55)
       at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:49)
       at org.apache.submarine.spark.security.RangerSparkAuthorizer$.checkPrivileges(RangerSparkAuthorizer.scala:100)
       at org.apache.spark.sql.catalyst.optimizer.SubmarineSparkRangerAuthorizationExtension.apply(SubmarineSparkRangerAuthorizationExtension.scala:65)
       at org.apache.spark.sql.catalyst.optimizer.SubmarineSparkRangerAuthorizationExtension.apply(SubmarineSparkRangerAuthorizationExtension.scala:40)
       at org.apache.spark.sql.catalyst.rules.RuleExecutor.$anonfun$execute$2(RuleExecutor.scala:216)
       at scala.collection.LinearSeqOptimized.foldLeft(LinearSeqOptimized.scala:126)
       at scala.collection.LinearSeqOptimized.foldLeft$(LinearSeqOptimized.scala:122)
       at scala.collection.immutable.List.foldLeft(List.scala:89)
       at org.apache.spark.sql.catalyst.rules.RuleExecutor.$anonfun$execute$1(RuleExecutor.scala:213)
       at org.apache.spark.sql.catalyst.rules.RuleExecutor.$anonfun$execute$1$adapted(RuleExecutor.scala:205)
       at scala.collection.immutable.List.foreach(List.scala:392)
       at org.apache.spark.sql.catalyst.rules.RuleExecutor.execute(RuleExecutor.scala:205)
       at org.apache.spark.sql.catalyst.rules.RuleExecutor.$anonfun$executeAndTrack$1(RuleExecutor.scala:183)
       at org.apache.spark.sql.catalyst.QueryPlanningTracker$.withTracker(QueryPlanningTracker.scala:88)
       at org.apache.spark.sql.catalyst.rules.RuleExecutor.executeAndTrack(RuleExecutor.scala:183)
       at org.apache.spark.sql.execution.QueryExecution.$anonfun$optimizedPlan$1(QueryExecution.scala:87)
       at org.apache.spark.sql.catalyst.QueryPlanningTracker.measurePhase(QueryPlanningTracker.scala:111)
       at org.apache.spark.sql.execution.QueryExecution.$anonfun$executePhase$1(QueryExecution.scala:143)
       at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:775)
       at org.apache.spark.sql.execution.QueryExecution.executePhase(QueryExecution.scala:143)
       at org.apache.spark.sql.execution.QueryExecution.optimizedPlan$lzycompute(QueryExecution.scala:84)
       at org.apache.spark.sql.execution.QueryExecution.optimizedPlan(QueryExecution.scala:84)
       at org.apache.spark.sql.execution.QueryExecution.assertOptimized(QueryExecution.scala:95)
       at org.apache.spark.sql.execution.QueryExecution.executedPlan$lzycompute(QueryExecution.scala:113)
       at org.apache.spark.sql.execution.QueryExecution.executedPlan(QueryExecution.scala:110)
       at org.apache.spark.sql.execution.QueryExecution.$anonfun$simpleString$2(QueryExecution.scala:161)
       at org.apache.spark.sql.execution.ExplainUtils$.processPlan(ExplainUtils.scala:115)
       at org.apache.spark.sql.execution.QueryExecution.simpleString(QueryExecution.scala:161)
       at org.apache.spark.sql.execution.QueryExecution.org$apache$spark$sql$execution$QueryExecution$$explainString(QueryExecution.scala:206)
       at org.apache.spark.sql.execution.QueryExecution.explainString(QueryExecution.scala:175)
       at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withNewExecutionId$5(SQLExecution.scala:98)
       at org.apache.spark.sql.execution.SQLExecution$.withSQLConfPropagated(SQLExecution.scala:163)
       at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withNewExecutionId$1(SQLExecution.scala:90)
       at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:775)
       at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:64)
       at org.apache.spark.sql.Dataset.withAction(Dataset.scala:3685)
       at org.apache.spark.sql.Dataset.<init>(Dataset.scala:228)
       at org.apache.spark.sql.Dataset$.$anonfun$ofRows$2(Dataset.scala:99)
       at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:775)
       at org.apache.spark.sql.Dataset$.ofRows(Dataset.scala:96)
       at org.apache.spark.sql.SparkSession.$anonfun$sql$1(SparkSession.scala:618)
       at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:775)
       at org.apache.spark.sql.SparkSession.sql(SparkSession.scala:613)
       at org.apache.kyuubi.engine.spark.operation.ExecuteStatement.$anonfun$executeStatement$1(ExecuteStatement.scala:92)
       at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
       at org.apache.kyuubi.engine.spark.operation.ExecuteStatement.withLocalProperties(ExecuteStatement.scala:143)
       at org.apache.kyuubi.engine.spark.operation.ExecuteStatement.org$apache$kyuubi$engine$spark$operation$ExecuteStatement$$executeStatement(ExecuteStatement.scala:86)
       at org.apache.kyuubi.engine.spark.operation.ExecuteStatement.runInternal(ExecuteStatement.scala:129)
       at org.apache.kyuubi.operation.AbstractOperation.run(AbstractOperation.scala:129)
       at org.apache.kyuubi.session.AbstractSession.runOperation(AbstractSession.scala:95)
       at org.apache.kyuubi.engine.spark.session.SparkSessionImpl.runOperation(SparkSessionImpl.scala:44)
       at org.apache.kyuubi.session.AbstractSession.$anonfun$executeStatement$1(AbstractSession.scala:123)
       at org.apache.kyuubi.session.AbstractSession.withAcquireRelease(AbstractSession.scala:78)
       at org.apache.kyuubi.session.AbstractSession.executeStatement(AbstractSession.scala:120)
       at org.apache.kyuubi.service.AbstractBackendService.executeStatement(AbstractBackendService.scala:61)
       at org.apache.kyuubi.service.ThriftFrontendService.ExecuteStatement(ThriftFrontendService.scala:256)
       at org.apache.hive.service.rpc.thrift.TCLIService$Processor$ExecuteStatement.getResult(TCLIService.java:1557)
       at org.apache.hive.service.rpc.thrift.TCLIService$Processor$ExecuteStatement.getResult(TCLIService.java:1542)
       at org.apache.thrift.ProcessFunction.process(ProcessFunction.java:38)
       at org.apache.thrift.TBaseProcessor.process(TBaseProcessor.java:39)
       at org.apache.kyuubi.service.authentication.TSetIpAddressProcessor.process(TSetIpAddressProcessor.scala:36)
       at org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:310)
       at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
       at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
       at java.lang.Thread.run(Thread.java:748)

       at org.apache.kyuubi.KyuubiSQLException$.apply(KyuubiSQLException.scala:68)
       at org.apache.kyuubi.engine.spark.operation.SparkOperation$$anonfun$onError$1.applyOrElse(SparkOperation.scala:97)
       at org.apache.kyuubi.engine.spark.operation.SparkOperation$$anonfun$onError$1.applyOrElse(SparkOperation.scala:81)
       at scala.runtime.AbstractPartialFunction.apply(AbstractPartialFunction.scala:38)
       at org.apache.kyuubi.engine.spark.operation.ExecuteStatement.$anonfun$executeStatement$1(ExecuteStatement.scala:100)
       at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
       at org.apache.kyuubi.engine.spark.operation.ExecuteStatement.withLocalProperties(ExecuteStatement.scala:143)
       at org.apache.kyuubi.engine.spark.operation.ExecuteStatement.org$apache$kyuubi$engine$spark$operation$ExecuteStatement$$executeStatement(ExecuteStatement.scala:86)
       at org.apache.kyuubi.engine.spark.operation.ExecuteStatement.runInternal(ExecuteStatement.scala:129)
       at org.apache.kyuubi.operation.AbstractOperation.run(AbstractOperation.scala:129)
       at org.apache.kyuubi.session.AbstractSession.runOperation(AbstractSession.scala:95)
       at org.apache.kyuubi.engine.spark.session.SparkSessionImpl.runOperation(SparkSessionImpl.scala:44)
       at org.apache.kyuubi.session.AbstractSession.$anonfun$executeStatement$1(AbstractSession.scala:123)
       at org.apache.kyuubi.session.AbstractSession.withAcquireRelease(AbstractSession.scala:78)
       at org.apache.kyuubi.session.AbstractSession.executeStatement(AbstractSession.scala:120)
       at org.apache.kyuubi.service.AbstractBackendService.executeStatement(AbstractBackendService.scala:61)
       at org.apache.kyuubi.service.ThriftFrontendService.ExecuteStatement(ThriftFrontendService.scala:256)
       at org.apache.hive.service.rpc.thrift.TCLIService$Processor$ExecuteStatement.getResult(TCLIService.java:1557)
       at org.apache.hive.service.rpc.thrift.TCLIService$Processor$ExecuteStatement.getResult(TCLIService.java:1542)
       at org.apache.thrift.ProcessFunction.process(ProcessFunction.java:38)
       at org.apache.thrift.TBaseProcessor.process(TBaseProcessor.java:39)
       at org.apache.kyuubi.service.authentication.TSetIpAddressProcessor.process(TSetIpAddressProcessor.scala:36)
       at org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:310)
       at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
       at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
       at java.lang.Thread.run(Thread.java:748)
   Caused by: org.apache.submarine.spark.security.SparkAccessControlException: Permission denied: user [schedule] does not have [USE] privilege on [spark_catalog]
       at org.apache.submarine.spark.security.RangerSparkAuthorizer$.$anonfun$checkPrivileges$3(RangerSparkAuthorizer.scala:126)
       at org.apache.submarine.spark.security.RangerSparkAuthorizer$.$anonfun$checkPrivileges$3$adapted(RangerSparkAuthorizer.scala:100)
       at scala.collection.mutable.ResizableArray.foreach(ResizableArray.scala:62)
       at scala.collection.mutable.ResizableArray.foreach$(ResizableArray.scala:55)
       at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:49)
       at org.apache.submarine.spark.security.RangerSparkAuthorizer$.checkPrivileges(RangerSparkAuthorizer.scala:100)
       at org.apache.spark.sql.catalyst.optimizer.SubmarineSparkRangerAuthorizationExtension.apply(SubmarineSparkRangerAuthorizationExtension.scala:65)
       at org.apache.spark.sql.catalyst.optimizer.SubmarineSparkRangerAuthorizationExtension.apply(SubmarineSparkRangerAuthorizationExtension.scala:40)
       at org.apache.spark.sql.catalyst.rules.RuleExecutor.$anonfun$execute$2(RuleExecutor.scala:216)
       at scala.collection.LinearSeqOptimized.foldLeft(LinearSeqOptimized.scala:126)
       at scala.collection.LinearSeqOptimized.foldLeft$(LinearSeqOptimized.scala:122)
       at scala.collection.immutable.List.foldLeft(List.scala:89)
       at org.apache.spark.sql.catalyst.rules.RuleExecutor.$anonfun$execute$1(RuleExecutor.scala:213)
       at org.apache.spark.sql.catalyst.rules.RuleExecutor.$anonfun$execute$1$adapted(RuleExecutor.scala:205)
       at scala.collection.immutable.List.foreach(List.scala:392)
       at org.apache.spark.sql.catalyst.rules.RuleExecutor.execute(RuleExecutor.scala:205)
       at org.apache.spark.sql.catalyst.rules.RuleExecutor.$anonfun$executeAndTrack$1(RuleExecutor.scala:183)
       at org.apache.spark.sql.catalyst.QueryPlanningTracker$.withTracker(QueryPlanningTracker.scala:88)
       at org.apache.spark.sql.catalyst.rules.RuleExecutor.executeAndTrack(RuleExecutor.scala:183)
       at org.apache.spark.sql.execution.QueryExecution.$anonfun$optimizedPlan$1(QueryExecution.scala:87)
       at org.apache.spark.sql.catalyst.QueryPlanningTracker.measurePhase(QueryPlanningTracker.scala:111)
       at org.apache.spark.sql.execution.QueryExecution.$anonfun$executePhase$1(QueryExecution.scala:143)
       at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:775)
       at org.apache.spark.sql.execution.QueryExecution.executePhase(QueryExecution.scala:143)
       at org.apache.spark.sql.execution.QueryExecution.optimizedPlan$lzycompute(QueryExecution.scala:84)
       at org.apache.spark.sql.execution.QueryExecution.optimizedPlan(QueryExecution.scala:84)
       at org.apache.spark.sql.execution.QueryExecution.assertOptimized(QueryExecution.scala:95)
       at org.apache.spark.sql.execution.QueryExecution.executedPlan$lzycompute(QueryExecution.scala:113)
       at org.apache.spark.sql.execution.QueryExecution.executedPlan(QueryExecution.scala:110)
       at org.apache.spark.sql.execution.QueryExecution.$anonfun$simpleString$2(QueryExecution.scala:161)
       at org.apache.spark.sql.execution.ExplainUtils$.processPlan(ExplainUtils.scala:115)
       at org.apache.spark.sql.execution.QueryExecution.simpleString(QueryExecution.scala:161)
       at org.apache.spark.sql.execution.QueryExecution.org$apache$spark$sql$execution$QueryExecution$$explainString(QueryExecution.scala:206)
       at org.apache.spark.sql.execution.QueryExecution.explainString(QueryExecution.scala:175)
       at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withNewExecutionId$5(SQLExecution.scala:98)
       at org.apache.spark.sql.execution.SQLExecution$.withSQLConfPropagated(SQLExecution.scala:163)
       at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withNewExecutionId$1(SQLExecution.scala:90)
       at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:775)
       at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:64)
       at org.apache.spark.sql.Dataset.withAction(Dataset.scala:3685)
       at org.apache.spark.sql.Dataset.<init>(Dataset.scala:228)
       at org.apache.spark.sql.Dataset$.$anonfun$ofRows$2(Dataset.scala:99)
       at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:775)
       at org.apache.spark.sql.Dataset$.ofRows(Dataset.scala:96)
       at org.apache.spark.sql.SparkSession.$anonfun$sql$1(SparkSession.scala:618)
       at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:775)
       at org.apache.spark.sql.SparkSession.sql(SparkSession.scala:613)
       at org.apache.kyuubi.engine.spark.operation.ExecuteStatement.$anonfun$executeStatement$1(ExecuteStatement.scala:92)
   ```
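   For reference, this is roughly the policy we tried to add (a sketch via Ranger's public v2 REST API; the service name `spark_service`, the endpoint, and the credentials are placeholders for our environment, and the `use` access type is inferred from the `[USE]` wording in the error):
   ```scala
   import java.net.URI
   import java.net.http.{HttpClient, HttpRequest, HttpResponse}
   import java.util.Base64

   // Sketch of the Ranger policy we created for spark_catalog, expressed as
   // a call to Ranger's public v2 policy API. Whether spark_catalog should
   // match as a "database" resource is exactly what seems unclear here.
   val policyJson =
     """{
       |  "service": "spark_service",
       |  "name": "spark_catalog_use",
       |  "resources": { "database": { "values": ["spark_catalog"] } },
       |  "policyItems": [{
       |    "users": ["schedule"],
       |    "accesses": [{ "type": "use", "isAllowed": true }]
       |  }]
       |}""".stripMargin

   // Placeholder admin credentials for basic auth.
   val auth = Base64.getEncoder.encodeToString("admin:password".getBytes("UTF-8"))

   val request = HttpRequest.newBuilder()
     .uri(URI.create("http://ranger-admin:6080/service/public/v2/api/policy"))
     .header("Content-Type", "application/json")
     .header("Authorization", s"Basic $auth")
     .POST(HttpRequest.BodyPublishers.ofString(policyJson))
     .build()

   val response = HttpClient.newHttpClient()
     .send(request, HttpResponse.BodyHandlers.ofString())
   println(s"Ranger responded: ${response.statusCode()}")
   ```
   Even with this policy in place, the authorizer still denies the USE check on `spark_catalog` for user `schedule`.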

