danhuawang opened a new issue, #10204:
URL: https://github.com/apache/gravitino/issues/10204

   ### Version
   
   main branch
   
   ### Describe what's wrong
   
   ```
wangdanhua@wangdanhuadeMBP bin % ./spark-sql --jars \
  "/Users/wangdanhua/Workspace/spark-bin-irc/jars/iceberg-aws-bundle-1.10.0.jar,/Users/wangdanhua/Workspace/spark-bin-irc/jars/iceberg-spark-runtime-3.4_2.12-1.10.0.jar" \
  --conf spark.sql.extensions=org.apache.iceberg.spark.extensions.IcebergSparkSessionExtensions \
  --conf spark.sql.catalog.rest=org.apache.iceberg.spark.SparkCatalog \
  --conf spark.sql.catalog.rest.type=rest \
  --conf spark.sql.catalog.rest.uri=http://127.0.0.1:19001/iceberg/ \
  --conf spark.sql.catalog.rest.header.X-Iceberg-Access-Delegation=vended-credentials \
  --conf spark.sql.catalog.rest.cache-enabled=false \
  --conf spark.sql.catalog.rest.rest.auth.type=basic \
  --conf spark.sql.catalog.rest.rest.auth.basic.username=sparkuser1 \
  --conf spark.sql.catalog.rest.rest.auth.basic.password=mock \
  --conf spark.locality.wait.node=0
WARNING: sun.reflect.Reflection.getCallerClass is not supported. This will impact performance.
26/03/04 19:43:23 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
26/03/04 19:43:24 WARN HiveConf: HiveConf of name hive.stats.jdbc.timeout does not exist
26/03/04 19:43:24 WARN HiveConf: HiveConf of name hive.stats.retries.wait does not exist
26/03/04 19:43:25 WARN ObjectStore: Version information not found in metastore. hive.metastore.schema.verification is not enabled so recording the schema version 2.3.0
26/03/04 19:43:25 WARN ObjectStore: setMetaStoreSchemaVersion called but recording version is disabled: version = 2.3.0, comment = Set by MetaStore [email protected]
Spark master: local[*], Application Id: local-1772624604079
spark-sql (default)> use rest;
26/03/04 19:43:29 WARN ObjectStore: Failed to get database global_temp, returning NoSuchObjectException
Time taken: 0.788 seconds
spark-sql ()> CREATE VIEW viewdb.test_spark_view AS SELECT * FROM viewdb.spark_test;
26/03/04 19:44:24 ERROR SparkSQLDriver: Failed in [CREATE VIEW viewdb.test_spark_view AS SELECT * FROM viewdb.spark_test]
org.apache.iceberg.exceptions.ServiceFailureException: Server error: RuntimeException: Authorization failed due to system internal error, User: 'sparkuser1', Operation: 'loadTable'
    at org.apache.iceberg.rest.ErrorHandlers$DefaultErrorHandler.accept(ErrorHandlers.java:243)
    at org.apache.iceberg.rest.ErrorHandlers$TableErrorHandler.accept(ErrorHandlers.java:124)
    at org.apache.iceberg.rest.ErrorHandlers$TableErrorHandler.accept(ErrorHandlers.java:108)
    at org.apache.iceberg.rest.HTTPClient.throwFailure(HTTPClient.java:240)
    at org.apache.iceberg.rest.HTTPClient.execute(HTTPClient.java:336)
    at org.apache.iceberg.rest.HTTPClient.execute(HTTPClient.java:297)
    at org.apache.iceberg.rest.BaseHTTPClient.get(BaseHTTPClient.java:77)
    at org.apache.iceberg.rest.RESTSessionCatalog.loadInternal(RESTSessionCatalog.java:375)
    at org.apache.iceberg.rest.RESTSessionCatalog.loadTable(RESTSessionCatalog.java:399)
    at org.apache.iceberg.catalog.BaseSessionCatalog$AsCatalog.loadTable(BaseSessionCatalog.java:99)
    at org.apache.iceberg.rest.RESTCatalog.loadTable(RESTCatalog.java:106)
    at org.apache.iceberg.spark.SparkCatalog.load(SparkCatalog.java:846)
    at org.apache.iceberg.spark.SparkCatalog.loadTable(SparkCatalog.java:170)
    at org.apache.spark.sql.connector.catalog.CatalogV2Util$.getTable(CatalogV2Util.scala:355)
    at org.apache.spark.sql.connector.catalog.CatalogV2Util$.loadTable(CatalogV2Util.scala:336)
    at org.apache.spark.sql.catalyst.analysis.Analyzer$ResolveRelations$.$anonfun$resolveRelation$3(Analyzer.scala:1264)
    at scala.Option.orElse(Option.scala:447)
    at org.apache.spark.sql.catalyst.analysis.Analyzer$ResolveRelations$.$anonfun$resolveRelation$1(Analyzer.scala:1263)
    at scala.Option.orElse(Option.scala:447)
    at org.apache.spark.sql.catalyst.analysis.Analyzer$ResolveRelations$.org$apache$spark$sql$catalyst$analysis$Analyzer$ResolveRelations$$resolveRelation(Analyzer.scala:1255)
    at org.apache.spark.sql.catalyst.analysis.Analyzer$ResolveRelations$$anonfun$apply$14.applyOrElse(Analyzer.scala:1118)
    at org.apache.spark.sql.catalyst.analysis.Analyzer$ResolveRelations$$anonfun$apply$14.applyOrElse(Analyzer.scala:1082)
    at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.$anonfun$resolveOperatorsUpWithPruning$3(AnalysisHelper.scala:138)
    at org.apache.spark.sql.catalyst.trees.CurrentOrigin$.withOrigin(TreeNode.scala:104)
    at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.$anonfun$resolveOperatorsUpWithPruning$1(AnalysisHelper.scala:138)
    at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper$.allowInvokingTransformsInAnalyzer(AnalysisHelper.scala:323)
    at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.resolveOperatorsUpWithPruning(AnalysisHelper.scala:134)
    at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.resolveOperatorsUpWithPruning$(AnalysisHelper.scala:130)
    at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.resolveOperatorsUpWithPruning(LogicalPlan.scala:31)
    at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.$anonfun$resolveOperatorsUpWithPruning$2(AnalysisHelper.scala:135)
    at org.apache.spark.sql.catalyst.trees.UnaryLike.mapChildren(TreeNode.scala:1249)
    at org.apache.spark.sql.catalyst.trees.UnaryLike.mapChildren$(TreeNode.scala:1248)
    at org.apache.spark.sql.catalyst.plans.logical.Project.mapChildren(basicLogicalOperators.scala:69)
    at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.$anonfun$resolveOperatorsUpWithPruning$1(AnalysisHelper.scala:135)
    at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper$.allowInvokingTransformsInAnalyzer(AnalysisHelper.scala:323)
    at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.resolveOperatorsUpWithPruning(AnalysisHelper.scala:134)
    at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.resolveOperatorsUpWithPruning$(AnalysisHelper.scala:130)
    at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.resolveOperatorsUpWithPruning(LogicalPlan.scala:31)
    at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.$anonfun$resolveOperatorsUpWithPruning$2(AnalysisHelper.scala:135)
    at org.apache.spark.sql.catalyst.trees.BinaryLike.mapChildren(TreeNode.scala:1277)
    at org.apache.spark.sql.catalyst.trees.BinaryLike.mapChildren$(TreeNode.scala:1274)
    at org.apache.spark.sql.catalyst.plans.logical.views.CreateIcebergView.mapChildren(CreateIcebergView.scala:25)
    at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.$anonfun$resolveOperatorsUpWithPruning$1(AnalysisHelper.scala:135)
    at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper$.allowInvokingTransformsInAnalyzer(AnalysisHelper.scala:323)
    at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.resolveOperatorsUpWithPruning(AnalysisHelper.scala:134)
    at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.resolveOperatorsUpWithPruning$(AnalysisHelper.scala:130)
    at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.resolveOperatorsUpWithPruning(LogicalPlan.scala:31)
    at org.apache.spark.sql.catalyst.analysis.Analyzer$ResolveRelations$.apply(Analyzer.scala:1082)
    at org.apache.spark.sql.catalyst.analysis.Analyzer$ResolveRelations$.apply(Analyzer.scala:1041)
    at org.apache.spark.sql.catalyst.rules.RuleExecutor.$anonfun$execute$2(RuleExecutor.scala:222)
    at scala.collection.LinearSeqOptimized.foldLeft(LinearSeqOptimized.scala:126)
    at scala.collection.LinearSeqOptimized.foldLeft$(LinearSeqOptimized.scala:122)
    at scala.collection.immutable.List.foldLeft(List.scala:91)
    at org.apache.spark.sql.catalyst.rules.RuleExecutor.$anonfun$execute$1(RuleExecutor.scala:219)
    at org.apache.spark.sql.catalyst.rules.RuleExecutor.$anonfun$execute$1$adapted(RuleExecutor.scala:211)
    at scala.collection.immutable.List.foreach(List.scala:431)
    at org.apache.spark.sql.catalyst.rules.RuleExecutor.execute(RuleExecutor.scala:211)
    at org.apache.spark.sql.catalyst.analysis.Analyzer.org$apache$spark$sql$catalyst$analysis$Analyzer$$executeSameContext(Analyzer.scala:228)
    at org.apache.spark.sql.catalyst.analysis.Analyzer.$anonfun$execute$1(Analyzer.scala:224)
    at org.apache.spark.sql.catalyst.analysis.AnalysisContext$.withNewAnalysisContext(Analyzer.scala:173)
    at org.apache.spark.sql.catalyst.analysis.Analyzer.execute(Analyzer.scala:224)
    at org.apache.spark.sql.catalyst.analysis.Analyzer.execute(Analyzer.scala:188)
    at org.apache.spark.sql.catalyst.rules.RuleExecutor.$anonfun$executeAndTrack$1(RuleExecutor.scala:182)
    at org.apache.spark.sql.catalyst.QueryPlanningTracker$.withTracker(QueryPlanningTracker.scala:88)
    at org.apache.spark.sql.catalyst.rules.RuleExecutor.executeAndTrack(RuleExecutor.scala:182)
    at org.apache.spark.sql.catalyst.analysis.Analyzer.$anonfun$executeAndCheck$1(Analyzer.scala:209)
    at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper$.markInAnalyzer(AnalysisHelper.scala:330)
    at org.apache.spark.sql.catalyst.analysis.Analyzer.executeAndCheck(Analyzer.scala:208)
    at org.apache.spark.sql.execution.QueryExecution.$anonfun$analyzed$1(QueryExecution.scala:76)
    at org.apache.spark.sql.catalyst.QueryPlanningTracker.measurePhase(QueryPlanningTracker.scala:111)
    at org.apache.spark.sql.execution.QueryExecution.$anonfun$executePhase$2(QueryExecution.scala:202)
    at org.apache.spark.sql.execution.QueryExecution$.withInternalError(QueryExecution.scala:526)
    at org.apache.spark.sql.execution.QueryExecution.$anonfun$executePhase$1(QueryExecution.scala:202)
    at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:827)
    at org.apache.spark.sql.execution.QueryExecution.executePhase(QueryExecution.scala:201)
    at org.apache.spark.sql.execution.QueryExecution.analyzed$lzycompute(QueryExecution.scala:76)
    at org.apache.spark.sql.execution.QueryExecution.analyzed(QueryExecution.scala:74)
    at org.apache.spark.sql.execution.QueryExecution.assertAnalyzed(QueryExecution.scala:66)
    at org.apache.spark.sql.Dataset$.$anonfun$ofRows$2(Dataset.scala:98)
    at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:827)
    at org.apache.spark.sql.Dataset$.ofRows(Dataset.scala:96)
    at org.apache.spark.sql.SparkSession.$anonfun$sql$1(SparkSession.scala:640)
    at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:827)
    at org.apache.spark.sql.SparkSession.sql(SparkSession.scala:630)
    at org.apache.spark.sql.SparkSession.sql(SparkSession.scala:671)
    at org.apache.spark.sql.SQLContext.sql(SQLContext.scala:651)
    at org.apache.spark.sql.hive.thriftserver.SparkSQLDriver.run(SparkSQLDriver.scala:67)
    at org.apache.spark.sql.hive.thriftserver.SparkSQLCLIDriver.processCmd(SparkSQLCLIDriver.scala:415)
    at org.apache.spark.sql.hive.thriftserver.SparkSQLCLIDriver.$anonfun$processLine$1(SparkSQLCLIDriver.scala:533)
    at org.apache.spark.sql.hive.thriftserver.SparkSQLCLIDriver.$anonfun$processLine$1$adapted(SparkSQLCLIDriver.scala:527)
    at scala.collection.Iterator.foreach(Iterator.scala:943)
    at scala.collection.Iterator.foreach$(Iterator.scala:943)
    at scala.collection.AbstractIterator.foreach(Iterator.scala:1431)
    at scala.collection.IterableLike.foreach(IterableLike.scala:74)
    at scala.collection.IterableLike.foreach$(IterableLike.scala:73)
    at scala.collection.AbstractIterable.foreach(Iterable.scala:56)
    at org.apache.spark.sql.hive.thriftserver.SparkSQLCLIDriver.processLine(SparkSQLCLIDriver.scala:527)
    at org.apache.spark.sql.hive.thriftserver.SparkSQLCLIDriver$.main(SparkSQLCLIDriver.scala:307)
    at org.apache.spark.sql.hive.thriftserver.SparkSQLCLIDriver.main(SparkSQLCLIDriver.scala)
    at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:77)
    at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.base/java.lang.reflect.Method.invoke(Method.java:568)
    at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
    at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:1020)
    at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:192)
    at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:215)
    at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:91)
    at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:1111)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:1120)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Server error: RuntimeException: Authorization failed due to system internal error, User: 'sparkuser1', Operation: 'loadTable'
org.apache.iceberg.exceptions.ServiceFailureException: Server error: RuntimeException: Authorization failed due to system internal error, User: 'sparkuser1', Operation: 'loadTable'
    at org.apache.iceberg.rest.ErrorHandlers$DefaultErrorHandler.accept(ErrorHandlers.java:243)
    at org.apache.iceberg.rest.ErrorHandlers$TableErrorHandler.accept(ErrorHandlers.java:124)
    at org.apache.iceberg.rest.ErrorHandlers$TableErrorHandler.accept(ErrorHandlers.java:108)
    at org.apache.iceberg.rest.HTTPClient.throwFailure(HTTPClient.java:240)
    at org.apache.iceberg.rest.HTTPClient.execute(HTTPClient.java:336)
    at org.apache.iceberg.rest.HTTPClient.execute(HTTPClient.java:297)
    at org.apache.iceberg.rest.BaseHTTPClient.get(BaseHTTPClient.java:77)
    at org.apache.iceberg.rest.RESTSessionCatalog.loadInternal(RESTSessionCatalog.java:375)
    at org.apache.iceberg.rest.RESTSessionCatalog.loadTable(RESTSessionCatalog.java:399)
    at org.apache.iceberg.catalog.BaseSessionCatalog$AsCatalog.loadTable(BaseSessionCatalog.java:99)
    at org.apache.iceberg.rest.RESTCatalog.loadTable(RESTCatalog.java:106)
    at org.apache.iceberg.spark.SparkCatalog.load(SparkCatalog.java:846)
    at org.apache.iceberg.spark.SparkCatalog.loadTable(SparkCatalog.java:170)
    at org.apache.spark.sql.connector.catalog.CatalogV2Util$.getTable(CatalogV2Util.scala:355)
    at org.apache.spark.sql.connector.catalog.CatalogV2Util$.loadTable(CatalogV2Util.scala:336)
    at org.apache.spark.sql.catalyst.analysis.Analyzer$ResolveRelations$.$anonfun$resolveRelation$3(Analyzer.scala:1264)
    at scala.Option.orElse(Option.scala:447)
    at org.apache.spark.sql.catalyst.analysis.Analyzer$ResolveRelations$.$anonfun$resolveRelation$1(Analyzer.scala:1263)
    at scala.Option.orElse(Option.scala:447)
    at org.apache.spark.sql.catalyst.analysis.Analyzer$ResolveRelations$.org$apache$spark$sql$catalyst$analysis$Analyzer$ResolveRelations$$resolveRelation(Analyzer.scala:1255)
    at org.apache.spark.sql.catalyst.analysis.Analyzer$ResolveRelations$$anonfun$apply$14.applyOrElse(Analyzer.scala:1118)
    at org.apache.spark.sql.catalyst.analysis.Analyzer$ResolveRelations$$anonfun$apply$14.applyOrElse(Analyzer.scala:1082)
    at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.$anonfun$resolveOperatorsUpWithPruning$3(AnalysisHelper.scala:138)
    at org.apache.spark.sql.catalyst.trees.CurrentOrigin$.withOrigin(TreeNode.scala:104)
    at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.$anonfun$resolveOperatorsUpWithPruning$1(AnalysisHelper.scala:138)
    at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper$.allowInvokingTransformsInAnalyzer(AnalysisHel
   ```
   
   ### Error message and/or stacktrace
   
   ```
2026-03-04 19:43:57.610 WARN [iceberg-rest-50] [org.apache.gravitino.iceberg.service.IcebergExceptionMapper.toRESTResponse(IcebergExceptionMapper.java:84)] - Iceberg REST server unexpected exception:
java.lang.RuntimeException: Authorization failed due to system internal error, User: 'sparkuser1', Operation: 'loadTable'
    at org.apache.gravitino.server.web.filter.BaseMetadataAuthorizationMethodInterceptor.invoke(BaseMetadataAuthorizationMethodInterceptor.java:200) ~[gravitino-iceberg-rest-server-1.2.0-SNAPSHOT.jar:?]
    at org.jvnet.hk2.internal.MethodInterceptorHandler.invoke(MethodInterceptorHandler.java:97) ~[hk2-locator-2.6.1.jar:?]
    at org.apache.gravitino.iceberg.service.rest.IcebergTableOperations_$$_jvstcf9_1.loadTable(IcebergTableOperations_$$_jvstcf9_1.java) ~[gravitino-iceberg-rest-server-1.2.0-SNAPSHOT.jar:?]
    at jdk.internal.reflect.GeneratedMethodAccessor119.invoke(Unknown Source) ~[?:?]
    at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:?]
    at java.base/java.lang.reflect.Method.invoke(Method.java:568) ~[?:?]
    at org.glassfish.jersey.server.model.internal.ResourceMethodInvocationHandlerFactory.lambda$static$0(ResourceMethodInvocationHandlerFactory.java:52) ~[jersey-server-2.41.jar:?]
    at org.glassfish.jersey.server.model.internal.AbstractJavaResourceMethodDispatcher$1.run(AbstractJavaResourceMethodDispatcher.java:146) ~[jersey-server-2.41.jar:?]
    at org.glassfish.jersey.server.model.internal.AbstractJavaResourceMethodDispatcher.invoke(AbstractJavaResourceMethodDispatcher.java:189) ~[jersey-server-2.41.jar:?]
    at org.glassfish.jersey.server.model.internal.JavaResourceMethodDispatcherProvider$ResponseOutInvoker.doDispatch(JavaResourceMethodDispatcherProvider.java:176) ~[jersey-server-2.41.jar:?]
    at org.glassfish.jersey.server.model.internal.AbstractJavaResourceMethodDispatcher.dispatch(AbstractJavaResourceMethodDispatcher.java:93) ~[jersey-server-2.41.jar:?]
    at org.glassfish.jersey.server.model.ResourceMethodInvoker.invoke(ResourceMethodInvoker.java:478) ~[jersey-server-2.41.jar:?]
    at org.glassfish.jersey.server.model.ResourceMethodInvoker.apply(ResourceMethodInvoker.java:400) ~[jersey-server-2.41.jar:?]
    at org.glassfish.jersey.server.model.ResourceMethodInvoker.apply(ResourceMethodInvoker.java:81) ~[jersey-server-2.41.jar:?]
    at org.glassfish.jersey.server.ServerRuntime$1.run(ServerRuntime.java:256) ~[jersey-server-2.41.jar:?]
    at org.glassfish.jersey.internal.Errors$1.call(Errors.java:248) ~[jersey-common-2.41.jar:?]
    at org.glassfish.jersey.internal.Errors$1.call(Errors.java:244) ~[jersey-common-2.41.jar:?]
    at org.glassfish.jersey.internal.Errors.process(Errors.java:292) ~[jersey-common-2.41.jar:?]
    at org.glassfish.jersey.internal.Errors.process(Errors.java:274) ~[jersey-common-2.41.jar:?]
    at org.glassfish.jersey.internal.Errors.process(Errors.java:244) ~[jersey-common-2.41.jar:?]
    at org.glassfish.jersey.process.internal.RequestScope.runInScope(RequestScope.java:265) ~[jersey-common-2.41.jar:?]
    at org.glassfish.jersey.server.ServerRuntime.process(ServerRuntime.java:235) ~[jersey-server-2.41.jar:?]
    at org.glassfish.jersey.server.ApplicationHandler.handle(ApplicationHandler.java:684) ~[jersey-server-2.41.jar:?]
    at org.glassfish.jersey.servlet.WebComponent.serviceImpl(WebComponent.java:394) ~[jersey-container-servlet-core-2.41.jar:?]
    at org.glassfish.jersey.servlet.WebComponent.service(WebComponent.java:346) ~[jersey-container-servlet-core-2.41.jar:?]
    at org.glassfish.jersey.servlet.ServletContainer.service(ServletContainer.java:358) ~[jersey-container-servlet-core-2.41.jar:?]
    at org.glassfish.jersey.servlet.ServletContainer.service(ServletContainer.java:311) ~[jersey-container-servlet-core-2.41.jar:?]
    at org.glassfish.jersey.servlet.ServletContainer.service(ServletContainer.java:205) ~[jersey-container-servlet-core-2.41.jar:?]
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) ~[jetty-servlet-9.4.51.v20230217.jar:9.4.51.v20230217]
    at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1656) ~[jetty-servlet-9.4.51.v20230217.jar:9.4.51.v20230217]
    at org.apache.gravitino.server.authentication.AuthenticationFilter.lambda$doFilter$0(AuthenticationFilter.java:89) ~[gravitino-server-common-1.2.0-SNAPSHOT.jar:?]
    at java.base/java.security.AccessController.doPrivileged(AccessController.java:712) ~[?:?]
    at java.base/javax.security.auth.Subject.doAs(Subject.java:439) ~[?:?]
    at org.apache.gravitino.utils.PrincipalUtils.doAs(PrincipalUtils.java:44) ~[gravitino-core-1.2.0-SNAPSHOT.jar:?]
    at org.apache.gravitino.server.authentication.AuthenticationFilter.doFilter(AuthenticationFilter.java:86) ~[gravitino-server-common-1.2.0-SNAPSHOT.jar:?]
    at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193) ~[jetty-servlet-9.4.51.v20230217.jar:9.4.51.v20230217]
    at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626) ~[jetty-servlet-9.4.51.v20230217.jar:9.4.51.v20230217]
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552) ~[jetty-servlet-9.4.51.v20230217.jar:9.4.51.v20230217]
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) ~[jetty-server-9.4.51.v20230217.jar:9.4.51.v20230217]
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) ~[jetty-server-9.4.51.v20230217.jar:9.4.51.v20230217]
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) ~[jetty-server-9.4.51.v20230217.jar:9.4.51.v20230217]
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) ~[jetty-servlet-9.4.51.v20230217.jar:9.4.51.v20230217]
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) ~[jetty-server-9.4.51.v20230217.jar:9.4.51.v20230217]
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) ~[jetty-server-9.4.51.v20230217.jar:9.4.51.v20230217]
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) ~[jetty-server-9.4.51.v20230217.jar:9.4.51.v20230217]
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146) ~[jetty-server-9.4.51.v20230217.jar:9.4.51.v20230217]
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) ~[jetty-server-9.4.51.v20230217.jar:9.4.51.v20230217]
    at org.eclipse.jetty.server.Server.handle(Server.java:516) ~[jetty-server-9.4.51.v20230217.jar:9.4.51.v20230217]
    at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) ~[jetty-server-9.4.51.v20230217.jar:9.4.51.v20230217]
    at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) [jetty-server-9.4.51.v20230217.jar:9.4.51.v20230217]
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) [jetty-server-9.4.51.v20230217.jar:9.4.51.v20230217]
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) [jetty-server-9.4.51.v20230217.jar:9.4.51.v20230217]
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) [jetty-io-9.4.51.v20230217.jar:9.4.51.v20230217]
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) [jetty-io-9.4.51.v20230217.jar:9.4.51.v20230217]
    at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) [jetty-io-9.4.51.v20230217.jar:9.4.51.v20230217]
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) [jetty-util-9.4.51.v20230217.jar:9.4.51.v20230217]
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) [jetty-util-9.4.51.v20230217.jar:9.4.51.v20230217]
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) [jetty-util-9.4.51.v20230217.jar:9.4.51.v20230217]
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131) [jetty-util-9.4.51.v20230217.jar:9.4.51.v20230217]
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409) [jetty-util-9.4.51.v20230217.jar:9.4.51.v20230217]
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) [jetty-util-9.4.51.v20230217.jar:9.4.51.v20230217]
    at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) [jetty-util-9.4.51.v20230217.jar:9.4.51.v20230217]
    at java.base/java.lang.Thread.run(Thread.java:840) [?:?]
Caused by: org.apache.iceberg.jdbc.UncheckedSQLException: Failed to get view viewdb.spark_test from catalog jdbc
    at org.apache.iceberg.jdbc.JdbcViewOperations.doRefresh(JdbcViewOperations.java:78) ~[iceberg-core-1.10.1.jar:?]
    at org.apache.iceberg.view.BaseViewOperations.refresh(BaseViewOperations.java:89) ~[iceberg-core-1.10.1.jar:?]
    at org.apache.iceberg.view.BaseViewOperations.current(BaseViewOperations.java:79) ~[iceberg-core-1.10.1.jar:?]
    at org.apache.iceberg.view.BaseMetastoreViewCatalog.loadView(BaseMetastoreViewCatalog.java:60) ~[iceberg-core-1.10.1.jar:?]
    at org.apache.iceberg.catalog.ViewCatalog.viewExists(ViewCatalog.java:65) ~[iceberg-api-1.10.1.jar:?]
    at org.apache.gravitino.iceberg.common.ops.IcebergCatalogWrapper.viewExists(IcebergCatalogWrapper.java:261) ~[gravitino-iceberg-common-1.2.0-SNAPSHOT.jar:?]
    at org.apache.gravitino.server.web.filter.LoadTableAuthzHandler.process(LoadTableAuthzHandler.java:122) ~[gravitino-iceberg-rest-server-1.2.0-SNAPSHOT.jar:?]
    at org.apache.gravitino.server.web.filter.BaseMetadataAuthorizationMethodInterceptor.invoke(BaseMetadataAuthorizationMethodInterceptor.java:158) ~[gravitino-iceberg-rest-server-1.2.0-SNAPSHOT.jar:?]
    ... 62 more
Caused by: org.postgresql.util.PSQLException: This connection has been closed.
    at org.postgresql.jdbc.PgConnection.checkClosed(PgConnection.java:1009) ~[postgresql-42.7.2.jar:42.7.2]
    at org.postgresql.jdbc.PgConnection.prepareStatement(PgConnection.java:1859) ~[postgresql-42.7.2.jar:42.7.2]
    at org.postgresql.jdbc.PgConnection.prepareStatement(PgConnection.java:534) ~[postgresql-42.7.2.jar:42.7.2]
    at org.apache.iceberg.jdbc.JdbcUtil.lambda$tableOrView$2(JdbcUtil.java:607) ~[iceberg-core-1.10.1.jar:?]
    at org.apache.iceberg.ClientPoolImpl.run(ClientPoolImpl.java:72) ~[iceberg-core-1.10.1.jar:?]
    at org.apache.iceberg.ClientPoolImpl.run(ClientPoolImpl.java:65) ~[iceberg-core-1.10.1.jar:?]
    at org.apache.iceberg.jdbc.JdbcUtil.tableOrView(JdbcUtil.java:602) ~[iceberg-core-1.10.1.jar:?]
    at org.apache.iceberg.jdbc.JdbcUtil.loadView(JdbcUtil.java:650) ~[iceberg-core-1.10.1.jar:?]
    at org.apache.iceberg.jdbc.JdbcViewOperations.doRefresh(JdbcViewOperations.java:72) ~[iceberg-core-1.10.1.jar:?]
    at org.apache.iceberg.view.BaseViewOperations.refresh(BaseViewOperations.java:89) ~[iceberg-core-1.10.1.jar:?]
    at org.apache.iceberg.view.BaseViewOperations.current(BaseViewOperations.java:79) ~[iceberg-core-1.10.1.jar:?]
    at org.apache.iceberg.view.BaseMetastoreViewCatalog.loadView(BaseMetastoreViewCatalog.java:60) ~[iceberg-core-1.10.1.jar:?]
    at org.apache.iceberg.catalog.ViewCatalog.viewExists(ViewCatalog.java:65) ~[iceberg-api-1.10.1.jar:?]
    at org.apache.gravitino.iceberg.common.ops.IcebergCatalogWrapper.viewExists(IcebergCatalogWrapper.java:261) ~[gravitino-iceberg-common-1.2.0-SNAPSHOT.jar:?]
    at org.apache.gravitino.server.web.filter.LoadTableAuthzHandler.process(LoadTableAuthzHandler.java:122) ~[gravitino-iceberg-rest-server-1.2.0-SNAPSHOT.jar:?]
    at org.apache.gravitino.server.web.filter.BaseMetadataAuthorizationMethodInterceptor.invoke(BaseMetadataAuthorizationMethodInterceptor.java:158) ~[gravitino-iceberg-rest-server-1.2.0-SNAPSHOT.jar:?]
    ... 62 more
   ```
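
   Reading the server-side trace, the 500 does not look like an authorization decision at all: while checking the statement, `LoadTableAuthzHandler` calls `IcebergCatalogWrapper.viewExists`, Iceberg's `ClientPoolImpl` hands `JdbcUtil.tableOrView` a PostgreSQL connection the server has already closed, and the resulting `PSQLException` bubbles up as the generic `RuntimeException` above instead of a clean 403. Below is a minimal sketch of the usual guard against this, validating a pooled connection before lending it out; it illustrates the pattern only, it is not Iceberg's actual `ClientPoolImpl`/`JdbcClientPool` code, and the class name `SimpleJdbcPool` is made up:
   ```java
   import java.sql.Connection;
   import java.sql.DriverManager;
   import java.sql.SQLException;
   import java.util.ArrayDeque;
   import java.util.Deque;

   // Hypothetical illustration: a pool that refuses to hand out connections the
   // server has already closed. Iceberg's real pool (ClientPoolImpl/JdbcClientPool)
   // works differently; this only shows the validate-before-reuse pattern.
   public class SimpleJdbcPool {
     private final String jdbcUrl;
     private final Deque<Connection> idle = new ArrayDeque<>();

     public SimpleJdbcPool(String jdbcUrl) {
       this.jdbcUrl = jdbcUrl;
     }

     // Borrow a connection, discarding idle ones that are closed or invalid.
     public synchronized Connection borrow() throws SQLException {
       while (!idle.isEmpty()) {
         Connection conn = idle.poll();
         // isValid() pings the server, so a connection dropped on the server side
         // (the "This connection has been closed" case above) is discarded here
         // instead of failing the caller's statement.
         if (!conn.isClosed() && conn.isValid(5)) {
           return conn;
         }
         try {
           conn.close();
         } catch (SQLException ignored) {
           // already dead; nothing to release
         }
       }
       return DriverManager.getConnection(jdbcUrl);
     }

     // Return a connection to the pool after use.
     public synchronized void release(Connection conn) {
       idle.push(conn);
     }
   }
   ```
   Whichever layer ends up fixed (reconnecting in the pool, or catching the `UncheckedSQLException` in the authorization interceptor and mapping it to a proper status code), a dead metastore connection should not be reported to the client as an authorization failure.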
   
   ### How to reproduce
   
   1. Create a view through spark-sql as a user who lacks the create_view privilege. The expected response code is 403, but the server sometimes responds with 500 instead; the probe sketch after the command below shows how to check the code directly.
   ```
./spark-sql --jars "/Users/wangdanhua/Workspace/spark-bin-irc/jars/iceberg-aws-bundle-1.10.0.jar,/Users/wangdanhua/Workspace/spark-bin-irc/jars/iceberg-spark-runtime-3.4_2.12-1.10.0.jar" \
  --conf spark.sql.extensions=org.apache.iceberg.spark.extensions.IcebergSparkSessionExtensions \
  --conf spark.sql.catalog.rest=org.apache.iceberg.spark.SparkCatalog \
  --conf spark.sql.catalog.rest.type=rest \
  --conf spark.sql.catalog.rest.uri=http://127.0.0.1:19001/iceberg/ \
  --conf spark.sql.catalog.rest.header.X-Iceberg-Access-Delegation=vended-credentials \
  --conf spark.sql.catalog.rest.cache-enabled=false \
  --conf spark.sql.catalog.rest.rest.auth.type=basic \
  --conf spark.sql.catalog.rest.rest.auth.basic.username=sparkuser1 \
  --conf spark.sql.catalog.rest.rest.auth.basic.password=mock \
  --conf spark.locality.wait.node=0
   ```
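
   To see the status code the server actually returns, the failing `loadTable` route can also be probed without Spark in the middle. A sketch, assuming the standard Iceberg REST path `/v1/namespaces/{ns}/tables/{table}` under the `http://127.0.0.1:19001/iceberg/` URI configured above; the class name `LoadTableProbe` is made up, and the path may need adjusting if the deployment uses a catalog prefix:
   ```java
   import java.net.URI;
   import java.net.http.HttpClient;
   import java.net.http.HttpRequest;
   import java.net.http.HttpResponse;
   import java.nio.charset.StandardCharsets;
   import java.util.Base64;

   // Minimal probe for the response code of the loadTable call that fails above.
   public class LoadTableProbe {
     public static void main(String[] args) throws Exception {
       // Same basic-auth user as the spark-sql repro.
       String auth = Base64.getEncoder()
           .encodeToString("sparkuser1:mock".getBytes(StandardCharsets.UTF_8));
       HttpRequest request = HttpRequest.newBuilder()
           // Assumed path: standard Iceberg REST layout under the configured URI.
           .uri(URI.create("http://127.0.0.1:19001/iceberg/v1/namespaces/viewdb/tables/spark_test"))
           .header("Authorization", "Basic " + auth)
           .header("X-Iceberg-Access-Delegation", "vended-credentials")
           .GET()
           .build();
       HttpResponse<String> response =
           HttpClient.newHttpClient().send(request, HttpResponse.BodyHandlers.ofString());
       // Expected for a user without the privilege: 403. Observed intermittently: 500.
       System.out.println(response.statusCode() + " " + response.body());
     }
   }
   ```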
   
   ### Additional context
   
   _No response_

