dtenedor commented on code in PR #37256:
URL: https://github.com/apache/spark/pull/37256#discussion_r930162598


##########
sql/catalyst/src/main/scala/org/apache/spark/sql/internal/SQLConf.scala:
##########
@@ -2919,6 +2919,17 @@ object SQLConf {
       .stringConf
       .createWithDefault("csv,json,orc,parquet")
 
+  val ADD_DEFAULT_COLUMN_EXISTING_TABLE_BANNED_PROVIDERS =
+    buildConf("spark.sql.defaultColumn.addColumnExistingTableBannedProviders")
+      .internal()
+      .doc("List of table providers wherein SQL commands are NOT permitted to assign DEFAULT " +
+        "values to new columns in existing tables, such as when using the ALTER TABLE ... " +
+        "ADD COLUMNS command in SQL. Comma-separated list, whitespace ignored, case-insensitive.")

Review Comment:
   These are good points. It is not ideal to duplicate the table provider strings in two SQLConf entries.
   
   @gengliangwang and @amaliujia what do you think about simply reusing the `DEFAULT_COLUMN_ALLOWED_PROVIDERS` SQLConf and adding a simple rule to mark a table provider as "does not support `ALTER TABLE ADD COLUMN` commands"? For example, we could allow an asterisk after the table provider name to indicate this, so the SQLConf value for a new V2 data source would be `"csv,json,orc,parquet,myv2datasource*"`.
   
   Edit: I went ahead and implemented this change; the logic is simpler now.
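   To illustrate, a minimal sketch of how such an asterisk-suffix convention could be parsed. This is a hypothetical helper for illustration only, not the actual code in the PR; the object and method names are made up:

```scala
// Hypothetical sketch: parse a comma-separated, whitespace-insensitive,
// case-insensitive provider list where a trailing '*' marks a provider that
// does NOT support assigning DEFAULT values via ALTER TABLE ... ADD COLUMNS.
object DefaultColumnProviders {
  /** Returns (allowedProviders, providersBannedFromAddColumn). */
  def parse(conf: String): (Set[String], Set[String]) = {
    val entries = conf.split(",").map(_.trim.toLowerCase).filter(_.nonEmpty)
    // Providers suffixed with '*' allow DEFAULT values generally, but not
    // when adding columns to existing tables.
    val banned = entries.filter(_.endsWith("*")).map(_.dropRight(1)).toSet
    val allowed = entries.map(_.stripSuffix("*")).toSet
    (allowed, banned)
  }
}
```

   With this sketch, `parse("csv,json,orc,parquet,myv2datasource*")` would treat all five providers as supporting DEFAULT values, while only `myv2datasource` would be excluded from `ALTER TABLE ADD COLUMN` support, so one SQLConf entry can carry both pieces of information.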



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]

