xingchen1997 opened a new issue, #3598:
URL: https://github.com/apache/amoro/issues/3598

   ### What happened?
   
   Following the Spark Getting Started section of the official Amoro documentation (https://amoro.apache.org/docs/0.7.1/spark-getting-started/), I ran into **`org.apache.amoro.shade.org.apache.iceberg.exceptions.NoSuchTableException`** when creating a table.
   
   Configuration of the **spark** catalog:
   
   <img width="1151" alt="Image" src="https://github.com/user-attachments/assets/8d19fb03-de46-4afd-a43a-3d51cbe4f674" />
   
   
   **Logs:**
   AMS_HOST=127.0.0.1
   AMS_PORT=1260
   AMS_CATALOG_NAME=spark
   
   ${SPARK_HOME}/bin/spark-sql \
       --conf spark.sql.extensions=org.apache.amoro.spark.MixedFormatSparkExtensions \
       --conf spark.sql.catalog.spark=org.apache.amoro.spark.MixedFormatSparkCatalog \
       --conf spark.sql.catalog.spark.url=thrift://${AMS_HOST}:${AMS_PORT}/${AMS_CATALOG_NAME}
   
   spark-sql> use spark;
   Time taken: 0.018 seconds
   spark-sql> create database if not exists test_db5;
   Time taken: 0.041 seconds
   spark-sql> use test_db5;
   Time taken: 0.023 seconds
   spark-sql> create table spark. test_db5.test5 (id int, data string, ts timestamp) using mixed_iceberg;
   2025-06-05 13:04:56 ERROR SparkSQLDriver: Failed in [create table spark. test_db5.test5 (id int, data string, ts timestamp) using mixed_iceberg]
   org.apache.amoro.shade.org.apache.iceberg.exceptions.NoSuchTableException: spark.test_db5.test5 not exists
           at 
org.apache.amoro.shade.org.apache.iceberg.rest.ErrorHandlers$TableErrorHandler.accept(ErrorHandlers.java:103)
 ~[amoro-mixed-format-spark-runtime-3.3-0.7.1-incubating.jar:0.7.1-incubating]
           at 
org.apache.amoro.shade.org.apache.iceberg.rest.ErrorHandlers$TableErrorHandler.accept(ErrorHandlers.java:93)
 ~[amoro-mixed-format-spark-runtime-3.3-0.7.1-incubating.jar:0.7.1-incubating]
           at 
org.apache.amoro.shade.org.apache.iceberg.rest.HTTPClient.throwFailure(HTTPClient.java:183)
 ~[amoro-mixed-format-spark-runtime-3.3-0.7.1-incubating.jar:0.7.1-incubating]
           at 
org.apache.amoro.shade.org.apache.iceberg.rest.HTTPClient.execute(HTTPClient.java:292)
 ~[amoro-mixed-format-spark-runtime-3.3-0.7.1-incubating.jar:0.7.1-incubating]
           at 
org.apache.amoro.shade.org.apache.iceberg.rest.HTTPClient.execute(HTTPClient.java:226)
 ~[amoro-mixed-format-spark-runtime-3.3-0.7.1-incubating.jar:0.7.1-incubating]
           at 
org.apache.amoro.shade.org.apache.iceberg.rest.HTTPClient.get(HTTPClient.java:327)
 ~[amoro-mixed-format-spark-runtime-3.3-0.7.1-incubating.jar:0.7.1-incubating]
           at 
org.apache.amoro.shade.org.apache.iceberg.rest.RESTClient.get(RESTClient.java:96)
 ~[amoro-mixed-format-spark-runtime-3.3-0.7.1-incubating.jar:0.7.1-incubating]
           at 
org.apache.amoro.shade.org.apache.iceberg.rest.RESTSessionCatalog.loadInternal(RESTSessionCatalog.java:300)
 ~[amoro-mixed-format-spark-runtime-3.3-0.7.1-incubating.jar:0.7.1-incubating]
           at 
org.apache.amoro.shade.org.apache.iceberg.rest.RESTSessionCatalog.loadTable(RESTSessionCatalog.java:316)
 ~[amoro-mixed-format-spark-runtime-3.3-0.7.1-incubating.jar:0.7.1-incubating]
           at 
org.apache.amoro.shade.org.apache.iceberg.catalog.BaseSessionCatalog$AsCatalog.loadTable(BaseSessionCatalog.java:99)
 ~[amoro-mixed-format-spark-runtime-3.3-0.7.1-incubating.jar:0.7.1-incubating]
           at 
org.apache.amoro.shade.org.apache.iceberg.rest.RESTCatalog.loadTable(RESTCatalog.java:96)
 ~[amoro-mixed-format-spark-runtime-3.3-0.7.1-incubating.jar:0.7.1-incubating]
           at 
com.github.benmanes.caffeine.cache.BoundedLocalCache.lambda$doComputeIfAbsent$14(BoundedLocalCache.java:2406)
 ~[amoro-mixed-format-spark-runtime-3.3-0.7.1-incubating.jar:0.7.1-incubating]
           at 
java.util.concurrent.ConcurrentHashMap.compute(ConcurrentHashMap.java:1916) 
~[?:?]
           at 
com.github.benmanes.caffeine.cache.BoundedLocalCache.doComputeIfAbsent(BoundedLocalCache.java:2404)
 ~[amoro-mixed-format-spark-runtime-3.3-0.7.1-incubating.jar:0.7.1-incubating]
           at 
com.github.benmanes.caffeine.cache.BoundedLocalCache.computeIfAbsent(BoundedLocalCache.java:2387)
 ~[amoro-mixed-format-spark-runtime-3.3-0.7.1-incubating.jar:0.7.1-incubating]
           at 
com.github.benmanes.caffeine.cache.LocalCache.computeIfAbsent(LocalCache.java:108)
 ~[amoro-mixed-format-spark-runtime-3.3-0.7.1-incubating.jar:0.7.1-incubating]
           at 
com.github.benmanes.caffeine.cache.LocalManualCache.get(LocalManualCache.java:62)
 ~[amoro-mixed-format-spark-runtime-3.3-0.7.1-incubating.jar:0.7.1-incubating]
           at 
org.apache.amoro.shade.org.apache.iceberg.CachingCatalog.loadTable(CachingCatalog.java:166)
 ~[amoro-mixed-format-spark-runtime-3.3-0.7.1-incubating.jar:0.7.1-incubating]
           at 
org.apache.amoro.mixed.BasicMixedIcebergCatalog.lambda$loadTable$5(BasicMixedIcebergCatalog.java:150)
 ~[amoro-mixed-format-spark-runtime-3.3-0.7.1-incubating.jar:0.7.1-incubating]
           at 
org.apache.amoro.table.TableMetaStore.call(TableMetaStore.java:256) 
~[amoro-mixed-format-spark-runtime-3.3-0.7.1-incubating.jar:0.7.1-incubating]
           at 
org.apache.amoro.table.TableMetaStore.lambda$doAs$0(TableMetaStore.java:231) 
~[amoro-mixed-format-spark-runtime-3.3-0.7.1-incubating.jar:0.7.1-incubating]
           at 
java.security.AccessController.doPrivileged(AccessController.java:399) ~[?:?]
           at javax.security.auth.Subject.doAs(Subject.java:376) ~[?:?]
           at 
org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1855)
 ~[hadoop-client-api-3.3.2.jar:?]
           at 
org.apache.amoro.table.TableMetaStore.doAs(TableMetaStore.java:231) 
~[amoro-mixed-format-spark-runtime-3.3-0.7.1-incubating.jar:0.7.1-incubating]
           at 
org.apache.amoro.mixed.BasicMixedIcebergCatalog.loadTable(BasicMixedIcebergCatalog.java:149)
 ~[amoro-mixed-format-spark-runtime-3.3-0.7.1-incubating.jar:0.7.1-incubating]
           at 
org.apache.amoro.spark.MixedFormatSparkCatalog.loadTable(MixedFormatSparkCatalog.java:83)
 ~[amoro-mixed-format-spark-3.3-0.7.1-incubating.jar:0.7.1-incubating]
           at 
org.apache.spark.sql.connector.catalog.TableCatalog.tableExists(TableCatalog.java:156)
 ~[spark-catalyst_2.12-3.3.0.jar:3.3.0]
           at 
org.apache.spark.sql.execution.datasources.v2.CreateTableExec.run(CreateTableExec.scala:43)
 ~[spark-sql_2.12-3.3.0.jar:0.7.1-incubating]
           at 
org.apache.spark.sql.execution.datasources.v2.V2CommandExec.result$lzycompute(V2CommandExec.scala:43)
 ~[spark-sql_2.12-3.3.0.jar:0.7.1-incubating]
           at 
org.apache.spark.sql.execution.datasources.v2.V2CommandExec.result(V2CommandExec.scala:43)
 ~[spark-sql_2.12-3.3.0.jar:0.7.1-incubating]
           at 
org.apache.spark.sql.execution.datasources.v2.V2CommandExec.executeCollect(V2CommandExec.scala:49)
 ~[spark-sql_2.12-3.3.0.jar:0.7.1-incubating]
           at 
org.apache.spark.sql.execution.QueryExecution$$anonfun$eagerlyExecuteCommands$1.$anonfun$applyOrElse$1(QueryExecution.scala:98)
 ~[spark-sql_2.12-3.3.0.jar:3.3.0]
           at 
org.apache.spark.sql.execution.SQLExecution$.$anonfun$withNewExecutionId$6(SQLExecution.scala:109)
 ~[spark-sql_2.12-3.3.0.jar:3.3.0]
           at 
org.apache.spark.sql.execution.SQLExecution$.withSQLConfPropagated(SQLExecution.scala:169)
 ~[spark-sql_2.12-3.3.0.jar:3.3.0]
           at 
org.apache.spark.sql.execution.SQLExecution$.$anonfun$withNewExecutionId$1(SQLExecution.scala:95)
 ~[spark-sql_2.12-3.3.0.jar:3.3.0]
           at 
org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:779) 
~[spark-sql_2.12-3.3.0.jar:3.3.0]
           at 
org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:64)
 ~[spark-sql_2.12-3.3.0.jar:3.3.0]
           at 
org.apache.spark.sql.execution.QueryExecution$$anonfun$eagerlyExecuteCommands$1.applyOrElse(QueryExecution.scala:98)
 ~[spark-sql_2.12-3.3.0.jar:3.3.0]
           at 
org.apache.spark.sql.execution.QueryExecution$$anonfun$eagerlyExecuteCommands$1.applyOrElse(QueryExecution.scala:94)
 ~[spark-sql_2.12-3.3.0.jar:3.3.0]
           at 
org.apache.spark.sql.catalyst.trees.TreeNode.$anonfun$transformDownWithPruning$1(TreeNode.scala:584)
 ~[spark-catalyst_2.12-3.3.0.jar:3.3.0]
           at 
org.apache.spark.sql.catalyst.trees.CurrentOrigin$.withOrigin(TreeNode.scala:176)
 ~[spark-catalyst_2.12-3.3.0.jar:3.3.0]
           at 
org.apache.spark.sql.catalyst.trees.TreeNode.transformDownWithPruning(TreeNode.scala:584)
 ~[spark-catalyst_2.12-3.3.0.jar:3.3.0]
           at 
org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.org$apache$spark$sql$catalyst$plans$logical$AnalysisHelper$$super$transformDownWithPruning(LogicalPlan.scala:30)
 ~[spark-catalyst_2.12-3.3.0.jar:3.3.0]
           at 
org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.transformDownWithPruning(AnalysisHelper.scala:267)
 ~[spark-catalyst_2.12-3.3.0.jar:3.3.0]
           at 
org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.transformDownWithPruning$(AnalysisHelper.scala:263)
 ~[spark-catalyst_2.12-3.3.0.jar:3.3.0]
           at 
org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.transformDownWithPruning(LogicalPlan.scala:30)
 ~[spark-catalyst_2.12-3.3.0.jar:3.3.0]
           at 
org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.transformDownWithPruning(LogicalPlan.scala:30)
 ~[spark-catalyst_2.12-3.3.0.jar:3.3.0]
           at 
org.apache.spark.sql.catalyst.trees.TreeNode.transformDown(TreeNode.scala:560) 
~[spark-catalyst_2.12-3.3.0.jar:3.3.0]
           at 
org.apache.spark.sql.execution.QueryExecution.eagerlyExecuteCommands(QueryExecution.scala:94)
 ~[spark-sql_2.12-3.3.0.jar:3.3.0]
           at 
org.apache.spark.sql.execution.QueryExecution.commandExecuted$lzycompute(QueryExecution.scala:81)
 ~[spark-sql_2.12-3.3.0.jar:3.3.0]
           at 
org.apache.spark.sql.execution.QueryExecution.commandExecuted(QueryExecution.scala:79)
 ~[spark-sql_2.12-3.3.0.jar:3.3.0]
           at org.apache.spark.sql.Dataset.<init>(Dataset.scala:220) 
~[spark-sql_2.12-3.3.0.jar:3.3.0]
           at 
org.apache.spark.sql.Dataset$.$anonfun$ofRows$2(Dataset.scala:100) 
~[spark-sql_2.12-3.3.0.jar:3.3.0]
           at 
org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:779) 
~[spark-sql_2.12-3.3.0.jar:3.3.0]
           at org.apache.spark.sql.Dataset$.ofRows(Dataset.scala:97) 
~[spark-sql_2.12-3.3.0.jar:3.3.0]
           at 
org.apache.spark.sql.SparkSession.$anonfun$sql$1(SparkSession.scala:622) 
~[spark-sql_2.12-3.3.0.jar:3.3.0]
           at 
org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:779) 
~[spark-sql_2.12-3.3.0.jar:3.3.0]
           at org.apache.spark.sql.SparkSession.sql(SparkSession.scala:617) 
~[spark-sql_2.12-3.3.0.jar:3.3.0]
           at org.apache.spark.sql.SQLContext.sql(SQLContext.scala:651) 
~[spark-sql_2.12-3.3.0.jar:3.3.0]
           at 
org.apache.spark.sql.hive.thriftserver.SparkSQLDriver.run(SparkSQLDriver.scala:67)
 ~[spark-hive-thriftserver_2.12-3.3.0.jar:3.3.0]
           at 
org.apache.spark.sql.hive.thriftserver.SparkSQLCLIDriver.processCmd(SparkSQLCLIDriver.scala:384)
 ~[spark-hive-thriftserver_2.12-3.3.0.jar:3.3.0]
           at 
org.apache.spark.sql.hive.thriftserver.SparkSQLCLIDriver.$anonfun$processLine$1(SparkSQLCLIDriver.scala:504)
 ~[spark-hive-thriftserver_2.12-3.3.0.jar:3.3.0]
           at 
org.apache.spark.sql.hive.thriftserver.SparkSQLCLIDriver.$anonfun$processLine$1$adapted(SparkSQLCLIDriver.scala:498)
 ~[spark-hive-thriftserver_2.12-3.3.0.jar:3.3.0]
           at scala.collection.Iterator.foreach(Iterator.scala:943) 
~[scala-library-2.12.15.jar:?]
           at scala.collection.Iterator.foreach$(Iterator.scala:943) 
~[scala-library-2.12.15.jar:?]
           at scala.collection.AbstractIterator.foreach(Iterator.scala:1431) 
~[scala-library-2.12.15.jar:?]
           at scala.collection.IterableLike.foreach(IterableLike.scala:74) 
~[scala-library-2.12.15.jar:?]
           at scala.collection.IterableLike.foreach$(IterableLike.scala:73) 
~[scala-library-2.12.15.jar:?]
           at scala.collection.AbstractIterable.foreach(Iterable.scala:56) 
~[scala-library-2.12.15.jar:?]
           at 
org.apache.spark.sql.hive.thriftserver.SparkSQLCLIDriver.processLine(SparkSQLCLIDriver.scala:498)
 ~[spark-hive-thriftserver_2.12-3.3.0.jar:3.3.0]
           at 
org.apache.spark.sql.hive.thriftserver.SparkSQLCLIDriver$.main(SparkSQLCLIDriver.scala:286)
 ~[spark-hive-thriftserver_2.12-3.3.0.jar:3.3.0]
           at 
org.apache.spark.sql.hive.thriftserver.SparkSQLCLIDriver.main(SparkSQLCLIDriver.scala)
 ~[spark-hive-thriftserver_2.12-3.3.0.jar:3.3.0]
           at jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native 
Method) ~[?:?]
           at 
jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:77)
 ~[?:?]
           at 
jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
 ~[?:?]
           at java.lang.reflect.Method.invoke(Method.java:569) ~[?:?]
           at 
org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52) 
~[spark-core_2.12-3.3.0.jar:3.3.0]
           at 
org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:958)
 ~[spark-core_2.12-3.3.0.jar:3.3.0]
           at 
org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:180) 
~[spark-core_2.12-3.3.0.jar:3.3.0]
           at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:203) 
~[spark-core_2.12-3.3.0.jar:3.3.0]
           at 
org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:90) 
~[spark-core_2.12-3.3.0.jar:3.3.0]
           at 
org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:1046) 
~[spark-core_2.12-3.3.0.jar:3.3.0]
           at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:1055) 
~[spark-core_2.12-3.3.0.jar:3.3.0]
           at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala) 
~[spark-core_2.12-3.3.0.jar:3.3.0]
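   For completeness, the interactive steps above can be collapsed into a single non-interactive invocation (a sketch reusing the AMS endpoint, catalog name, and table DDL from this report; the host, port, and `SPARK_HOME` values are the same assumptions as above):
   
   ```shell
   # Reproduction sketch: same AMS endpoint and catalog settings as in the report.
   AMS_HOST=127.0.0.1
   AMS_PORT=1260
   AMS_CATALOG_NAME=spark
   
   ${SPARK_HOME}/bin/spark-sql \
       --conf spark.sql.extensions=org.apache.amoro.spark.MixedFormatSparkExtensions \
       --conf spark.sql.catalog.spark=org.apache.amoro.spark.MixedFormatSparkCatalog \
       --conf spark.sql.catalog.spark.url=thrift://${AMS_HOST}:${AMS_PORT}/${AMS_CATALOG_NAME} \
       -e "CREATE DATABASE IF NOT EXISTS spark.test_db5;
           CREATE TABLE spark.test_db5.test5 (id int, data string, ts timestamp) USING mixed_iceberg;"
   ```
   
   The CREATE TABLE fails in the same way whether issued interactively or via `-e`.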
   
   ### Affects Versions
   
   0.7.1
   
   ### What table formats are you seeing the problem on?
   
   _No response_
   
   ### What engines are you seeing the problem on?
   
   _No response_
   
   ### How to reproduce
   
   _No response_
   
   ### Relevant log output
   
   ```shell
   
   ```
   
   ### Anything else
   
   _No response_
   
   ### Are you willing to submit a PR?
   
   - [x] Yes I am willing to submit a PR!
   
   ### Code of Conduct
   
   - [x] I agree to follow this project's Code of Conduct

