dtenedor commented on code in PR #37256:
URL: https://github.com/apache/spark/pull/37256#discussion_r929421727
##########
sql/catalyst/src/main/scala/org/apache/spark/sql/internal/SQLConf.scala:
##########
@@ -2919,6 +2919,17 @@ object SQLConf {
.stringConf
.createWithDefault("csv,json,orc,parquet")
+  val ADD_DEFAULT_COLUMN_EXISTING_TABLE_BANNED_PROVIDERS =
+    buildConf("spark.sql.defaultColumn.addColumnExistingTableBannedProviders")
+      .internal()
+      .doc("List of table providers wherein SQL commands are NOT permitted to assign DEFAULT " +
+        "values to new columns in existing tables, such as when using the ALTER TABLE ... " +
+        "ADD COLUMNS command in SQL. Comma-separated list, whitespace ignored, case-insensitive.")
Review Comment:
I checked; it looks like we don't currently: today we support integers,
floating-point numbers, booleans, strings, time values, and byte buffers.
I hear your concern and am definitely open to ideas, though :)
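For context, here is a minimal sketch of how a "comma-separated, whitespace ignored, case-insensitive" provider list like this conf value could be interpreted. The object and method names are hypothetical illustrations, not code from this PR or from Spark itself:

```scala
// Hypothetical sketch: parse a comma-separated, case-insensitive provider
// list (whitespace ignored) and check membership. Not actual Spark code.
object BannedProviders {
  // Split on commas, trim surrounding whitespace, lowercase for
  // case-insensitive matching, and drop any empty entries.
  def parse(conf: String): Set[String] =
    conf.split(",").map(_.trim.toLowerCase).filter(_.nonEmpty).toSet

  // A provider is banned if its lowercased name appears in the parsed set.
  def isBanned(provider: String, conf: String): Boolean =
    parse(conf).contains(provider.toLowerCase)
}
```

With this interpretation, `isBanned("CSV", " csv , json ")` would be true while `isBanned("parquet", "csv, json")` would be false.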
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
To unsubscribe, e-mail: [email protected]
For queries about this service, please contact Infrastructure at:
[email protected]