jeesou commented on issue #11440:
URL: https://github.com/apache/iceberg/issues/11440#issuecomment-4132616400

   Hey @nastra @SNgari, is this issue still present? I am on the Iceberg 1.10.0 release
   
   and I am facing issues with views as well.
   My configs:
   
   ```
   spark = SparkSession \
       .builder \
       .appName("SparkDataCreation") \
       .config("spark.sql.extensions", "org.apache.iceberg.spark.extensions.IcebergSparkSessionExtensions") \
       .config("spark.sql.catalog.cat_iceberg", "org.apache.iceberg.spark.SparkCatalog") \
       .config("spark.sql.catalog.cat_iceberg.type", "hive") \
       .config("spark.sql.catalog.cat_iceberg.warehouse", "s3a://dev-cat") \
       .enableHiveSupport() \
       .getOrCreate()
   ```
   
   The queries I am running:
   
   ```
   spark.sql("USE cat_iceberg.mydb").show()
   
   spark.sql("CREATE VIEW iceberg_view AS SELECT * FROM spark_iceberg_table").show()
   
   spark.sql("SHOW VIEWS FROM cat_iceberg.mydb").show()
   
   spark.sql("DROP VIEW IF EXISTS cat_iceberg.mydb.iceberg_view_new")
   ```
   
   I cannot list any of the views I created; they show up under tables instead:
   
   ```
   show views ->
   +---------+--------+-----------+
   |namespace|viewName|isTemporary|
   +---------+--------+-----------+
   +---------+--------+-----------+
   
   show tables -> 
   +---------+--------------------+-----------+
   |namespace|           tableName|isTemporary|
   +---------+--------------------+-----------+
   |     mydb|        iceberg_view|      false|
   |     mydb| spark_iceberg_table|      false|
   |     mydb|       iceberg_view1|      false|
   +---------+--------------------+-----------+
   ```
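   To make the mismatch concrete, here is a quick plain-Python check over the output above (it only uses the names shown; `iceberg_view` and `iceberg_view1` are the views I created, so this is purely illustrative):

```python
# Sanity check over the SHOW TABLES / SHOW VIEWS output above:
# any name I created as a view that SHOW TABLES lists but SHOW VIEWS
# does not has evidently been registered as a table in the metastore.
def misregistered_views(table_names, view_names, created_views):
    tables = set(table_names)
    views = set(view_names)
    return sorted(v for v in created_views if v in tables and v not in views)

# Names copied from the output above.
table_names = ["iceberg_view", "spark_iceberg_table", "iceberg_view1"]
view_names = []  # SHOW VIEWS came back empty
print(misregistered_views(table_names, view_names, ["iceberg_view", "iceberg_view1"]))
# -> ['iceberg_view', 'iceberg_view1']
```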
   
   Trying to drop the view gives this error:
   
   ```
   Py4JJavaError: An error occurred while calling o1648.sql.
   : org.apache.iceberg.exceptions.NoSuchViewException: View does not exist: cat_iceberg.mydb.iceberg_view
        at org.apache.iceberg.hive.HiveOperationsBase.validateIcebergTableNotLoadedAsIcebergView(HiveOperationsBase.java:132)
        at org.apache.iceberg.hive.HiveViewOperations.doRefresh(HiveViewOperations.java:95)
        at org.apache.iceberg.view.BaseViewOperations.refresh(BaseViewOperations.java:89)
        at org.apache.iceberg.view.BaseViewOperations.current(BaseViewOperations.java:79)
        at org.apache.iceberg.hive.HiveCatalog.dropView(HiveCatalog.java:279)
        at org.apache.iceberg.spark.SparkCatalog.dropView(SparkCatalog.java:705)
        at org.apache.spark.sql.execution.datasources.v2.DropV2ViewExec.run(DropV2ViewExec.scala:37)
        at org.apache.spark.sql.execution.datasources.v2.V2CommandExec.result$lzycompute(V2CommandExec.scala:43)
        at org.apache.spark.sql.execution.datasources.v2.V2CommandExec.result(V2CommandExec.scala:43)
        at org.apache.spark.sql.execution.datasources.v2.V2CommandExec.executeCollect(V2CommandExec.scala:49)
        at org.apache.spark.sql.execution.QueryExecution$$anonfun$eagerlyExecuteCommands$1.$anonfun$applyOrElse$1(QueryExecution.scala:107)
        at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withNewExecutionId$6(SQLExecution.scala:125)
        at org.apache.spark.sql.execution.SQLExecution$.withSQLConfPropagated(SQLExecution.scala:201)
        at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withNewExecutionId$1(SQLExecution.scala:108)
        at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:900)
        at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:66)
        at org.apache.spark.sql.execution.QueryExecution$$anonfun$eagerlyExecuteCommands$1.applyOrElse(QueryExecution.scala:107)
        at org.apache.spark.sql.execution.QueryExecution$$anonfun$eagerlyExecuteCommands$1.applyOrElse(QueryExecution.scala:98)
        at org.apache.spark.sql.catalyst.trees.TreeNode.$anonfun$transformDownWithPruning$1(TreeNode.scala:461)
        at org.apache.spark.sql.catalyst.trees.CurrentOrigin$.withOrigin(origin.scala:76)
        at org.apache.spark.sql.catalyst.trees.TreeNode.transformDownWithPruning(TreeNode.scala:461)
        at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.org$apache$spark$sql$catalyst$plans$logical$AnalysisHelper$$super$transformDownWithPruning(LogicalPlan.scala:32)
        at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.transformDownWithPruning(AnalysisHelper.scala:267)
        at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.transformDownWithPruning$(AnalysisHelper.scala:263)
        at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.transformDownWithPruning(LogicalPlan.scala:32)
        at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.transformDownWithPruning(LogicalPlan.scala:32)
        at org.apache.spark.sql.catalyst.trees.TreeNode.transformDown(TreeNode.scala:437)
        at org.apache.spark.sql.execution.QueryExecution.eagerlyExecuteCommands(QueryExecution.scala:98)
        at org.apache.spark.sql.execution.QueryExecution.commandExecuted$lzycompute(QueryExecution.scala:85)
        at org.apache.spark.sql.execution.QueryExecution.commandExecuted(QueryExecution.scala:83)
        at org.apache.spark.sql.Dataset.<init>(Dataset.scala:220)
        at org.apache.spark.sql.Dataset$.$anonfun$ofRows$2(Dataset.scala:100)
        at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:900)
        at org.apache.spark.sql.Dataset$.ofRows(Dataset.scala:97)
        at org.apache.spark.sql.SparkSession.$anonfun$sql$1(SparkSession.scala:638)
        at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:900)
        at org.apache.spark.sql.SparkSession.sql(SparkSession.scala:629)
        at org.apache.spark.sql.SparkSession.sql(SparkSession.scala:659)
        at jdk.internal.reflect.GeneratedMethodAccessor54.invoke(Unknown Source)
        at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.base/java.lang.reflect.Method.invoke(Method.java:575)
        at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:244)
        at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:374)
        at py4j.Gateway.invoke(Gateway.java:282)
        at py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:132)
        at py4j.commands.CallCommand.execute(CallCommand.java:79)
        at py4j.ClientServerConnection.waitForCommands(ClientServerConnection.java:182)
        at py4j.ClientServerConnection.run(ClientServerConnection.java:106)
        at java.base/java.lang.Thread.run(Thread.java:854)
   ```
   
   An alter query like
   
   ```
   spark.sql(f"""
           ALTER VIEW cat_iceberg.mydb.iceberg_view
           RENAME TO cat_iceberg.mydb.iceberg_view_new
       """).show()
   ```
   
   gives this error:
   
   ```
   26/03/26 08:20:34 WARN Tasks: Retrying task after failure: sleepTimeMs=100 Cannot parse missing int: last-column-id
   java.lang.IllegalArgumentException: Cannot parse missing int: last-column-id
        at org.apache.iceberg.relocated.com.google.common.base.Preconditions.checkArgument(Preconditions.java:217)
        at org.apache.iceberg.util.JsonUtil.getInt(JsonUtil.java:117)
        at org.apache.iceberg.TableMetadataParser.fromJson(TableMetadataParser.java:365)
        at org.apache.iceberg.TableMetadataParser.fromJson(TableMetadataParser.java:339)
        at org.apache.iceberg.TableMetadataParser.read(TableMetadataParser.java:309)
        at org.apache.iceberg.TableMetadataParser.read(TableMetadataParser.java:294)
        at org.apache.iceberg.BaseMetastoreTableOperations.lambda$refreshFromMetadataLocation$0(BaseMetastoreTableOperations.java:180)
        at org.apache.iceberg.BaseMetastoreTableOperations.lambda$refreshFromMetadataLocation$1(BaseMetastoreTableOperations.java:199)
        at org.apache.iceberg.util.Tasks$Builder.runTaskWithRetry(Tasks.java:413)
        at org.apache.iceberg.util.Tasks$Builder.runSingleThreaded(Tasks.java:219)
        at org.apache.iceberg.util.Tasks$Builder.run(Tasks.java:203)
        at org.apache.iceberg.util.Tasks$Builder.run(Tasks.java:196)
        at org.apache.iceberg.BaseMetastoreTableOperations.refreshFromMetadataLocation(BaseMetastoreTableOperations.java:199)
        at org.apache.iceberg.BaseMetastoreTableOperations.refreshFromMetadataLocation(BaseMetastoreTableOperations.java:176)
        at org.apache.iceberg.BaseMetastoreTableOperations.refreshFromMetadataLocation(BaseMetastoreTableOperations.java:171)
        at org.apache.iceberg.hive.HiveTableOperations.doRefresh(HiveTableOperations.java:131)
        at org.apache.iceberg.BaseMetastoreTableOperations.refresh(BaseMetastoreTableOperations.java:88)
        at org.apache.iceberg.BaseMetastoreTableOperations.current(BaseMetastoreTableOperations.java:71)
        at org.apache.iceberg.BaseMetastoreCatalog.loadTable(BaseMetastoreCatalog.java:49)
        at org.apache.iceberg.shaded.com.github.benmanes.caffeine.cache.BoundedLocalCache.lambda$doComputeIfAbsent$14(BoundedLocalCache.java:2406)
        at java.base/java.util.concurrent.ConcurrentHashMap.compute(ConcurrentHashMap.java:1924)
        at org.apache.iceberg.shaded.com.github.benmanes.caffeine.cache.BoundedLocalCache.doComputeIfAbsent(BoundedLocalCache.java:2404)
        at org.apache.iceberg.shaded.com.github.benmanes.caffeine.cache.BoundedLocalCache.computeIfAbsent(BoundedLocalCache.java:2387)
        at org.apache.iceberg.shaded.com.github.benmanes.caffeine.cache.LocalCache.computeIfAbsent(LocalCache.java:108)
        at org.apache.iceberg.shaded.com.github.benmanes.caffeine.cache.LocalManualCache.get(LocalManualCache.java:62)
        at org.apache.iceberg.CachingCatalog.loadTable(CachingCatalog.java:147)
        at org.apache.iceberg.spark.SparkCatalog.load(SparkCatalog.java:846)
        at org.apache.iceberg.spark.SparkCatalog.loadTable(SparkCatalog.java:170)
        at org.apache.spark.sql.connector.catalog.CatalogV2Util$.getTable(CatalogV2Util.scala:363)
        at org.apache.spark.sql.connector.catalog.CatalogV2Util$.loadTable(CatalogV2Util.scala:337)
        at org.apache.spark.sql.catalyst.analysis.Analyzer$ResolveRelations$.$anonfun$lookupTableOrView$2(Analyzer.scala:1228)
        at scala.Option.orElse(Option.scala:447)
   ```
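   If I read the stack right, `TableMetadataParser` is being handed the view's metadata JSON, and table metadata requires a `last-column-id` field that view metadata does not carry. A tiny plain-Python illustration of that failure mode (the JSON below is made up for illustration, not real metadata, and `get_int` only mimics the strictness of Iceberg's `JsonUtil.getInt`):

```python
import json

# Mimics the strictness of Iceberg's JsonUtil.getInt: the field must exist.
def get_int(node, field):
    if field not in node:
        raise ValueError(f"Cannot parse missing int: {field}")
    return int(node[field])

# Hypothetical view metadata: it has view fields but no last-column-id,
# which only table metadata carries -- so a table-style parse fails.
view_metadata = json.loads('{"view-uuid": "0000", "format-version": 1}')
try:
    get_int(view_metadata, "last-column-id")
except ValueError as err:
    print(err)  # Cannot parse missing int: last-column-id
```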
   
   Do you have any idea what is going on here?

