amogh-jahagirdar opened a new pull request, #13464:
URL: https://github.com/apache/iceberg/pull/13464

   Currently, ALTER TABLE ADD COLUMN with default values is unsupported; however, the DDL succeeds and the default value is silently dropped from the Iceberg metadata. There is an ongoing PR for [supporting](https://github.com/apache/iceberg/pull/13107) this, but more work remains on it.
   
   In the interim, it would be ideal to explicitly surface an unsupported-operation exception to users when a default value is specified.
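   As a sketch of what the interim guard could look like (the class and method names below are hypothetical stand-ins, not the actual Iceberg or Spark APIs; the real check would live where Spark's `TableChange.AddColumn` changes are translated into Iceberg schema updates):

   ```java
   // Hypothetical, self-contained sketch of the interim guard.
   // AddColumn here is a minimal stand-in for Spark's TableChange.AddColumn.
   public class DefaultValueGuard {

       static final class AddColumn {
           final String name;
           final String defaultValue; // null when no DEFAULT clause was given

           AddColumn(String name, String defaultValue) {
               this.name = name;
               this.defaultValue = defaultValue;
           }
       }

       // Fail fast instead of silently dropping the default from table metadata.
       static void applyAddColumn(AddColumn add) {
           if (add.defaultValue != null) {
               throw new UnsupportedOperationException(
                   "Cannot add column '" + add.name + "': default values are not supported");
           }
           // ... otherwise proceed with the normal schema update ...
       }

       public static void main(String[] args) {
           applyAddColumn(new AddColumn("id", null)); // no default: accepted
           try {
               applyAddColumn(new AddColumn("flag", "true"));
               throw new AssertionError("expected UnsupportedOperationException");
           } catch (UnsupportedOperationException e) {
               System.out.println("rejected: " + e.getMessage());
           }
       }
   }
   ```

   The point is simply that the DDL should error loudly at analysis/apply time rather than succeed while the default is dropped.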
   
   Note:
   
   1. CREATE TABLE with default values already fails in Spark, since SparkCatalog/SparkSessionCatalog do not surface default values as a capability.
   2. SparkSessionCatalog with ALTER TABLE ADD COLUMN also already fails in Spark 3.4/3.5, because the analyzer tries to set [this](https://github.com/apache/spark/blob/master/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/util/ResolveDefaultColumnsUtil.scala#L633) as the delegating catalog during analysis of the default values, and we fail at https://github.com/apache/iceberg/blob/main/spark/v3.5/spark/src/main/java/org/apache/iceberg/spark/SparkSessionCatalog.java#L364. The resulting error message is less explicit, but that is more of an issue in Spark itself, so working around it for a clearer message isn't worthwhile, especially since support for default values is being added anyway.


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]

