cloud-fan commented on code in PR #39942:
URL: https://github.com/apache/spark/pull/39942#discussion_r1102410248


##########
sql/catalyst/src/main/scala/org/apache/spark/sql/internal/SQLConf.scala:
##########
@@ -3091,12 +3091,11 @@ object SQLConf {
     buildConf("spark.sql.defaultColumn.allowedProviders")
       .internal()
       .doc("List of table providers wherein SQL commands are permitted to assign DEFAULT column " +
-        "values. Comma-separated list, whitespace ignored, case-insensitive. If an asterisk " +
-        "appears after any table provider in this list, any command may assign DEFAULT column " +
-        "except `ALTER TABLE ... ADD COLUMN`. Otherwise, if no asterisk appears, all commands " +

Review Comment:
   I don't think it is reasonable to allow a builtin data source to partially support column default values. Custom v2 catalogs can reject adding columns by themselves in `TableCatalog.alterTable`, and builtin file sources must fully support default columns, as ADD COLUMN is not the only issue: people can do `CREATE TABLE t (with default value) USING parquet LOCATION ...` where the existing data files do not have the columns with default values.
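
For readers following the discussion, the asterisk semantics described in the doc string above could be sketched as follows. This is a standalone illustrative sketch, not Spark's actual implementation; the `AllowedProviders` object and its method names are hypothetical:

```scala
// Hypothetical sketch of the allowed-providers semantics in the doc string
// under review: a comma-separated, whitespace-insensitive, case-insensitive
// list of providers; a trailing asterisk on a provider means every command
// EXCEPT `ALTER TABLE ... ADD COLUMN` may assign DEFAULT values for it.
object AllowedProviders {
  final case class Rule(provider: String, allowAddColumn: Boolean)

  // Parse the config value into one rule per listed provider.
  def parse(conf: String): Seq[Rule] =
    conf.split(",").toSeq.map(_.trim.toLowerCase).filter(_.nonEmpty).map { p =>
      if (p.endsWith("*")) Rule(p.dropRight(1), allowAddColumn = false)
      else Rule(p, allowAddColumn = true)
    }

  // Check whether a command may assign DEFAULT values for this provider.
  def allows(conf: String, provider: String, isAddColumn: Boolean): Boolean =
    parse(conf).exists { r =>
      r.provider == provider.toLowerCase && (!isAddColumn || r.allowAddColumn)
    }
}
```

The review comment argues this per-command carve-out is the wrong granularity for builtin sources, since the `CREATE TABLE ... USING parquet LOCATION ...` case above is not covered by blocking ADD COLUMN alone.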



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]

