dtenedor commented on code in PR #36212:
URL: https://github.com/apache/spark/pull/36212#discussion_r854485737
##########
sql/catalyst/src/main/scala/org/apache/spark/sql/internal/SQLConf.scala:
##########
@@ -2858,15 +2858,15 @@ object SQLConf {
       .createWithDefault(true)
   val USE_NULLS_FOR_MISSING_DEFAULT_COLUMN_VALUES =
-    buildConf("spark.sql.defaultColumn.useNullsForMissingDefautValues")
+    buildConf("spark.sql.defaultColumn.useNullsForMissingDefaultValues")
       .internal()
       .doc("When true, and DEFAULT columns are enabled, allow column definitions lacking " +
         "explicit default values to behave as if they had specified DEFAULT NULL instead. " +
         "For example, this allows most INSERT INTO statements to specify only a prefix of the " +
         "columns in the target table, and the remaining columns will receive NULL values.")
       .version("3.4.0")
       .booleanConf
-      .createWithDefault(false)
+      .createWithDefault(true)
Review Comment:
Update on this: we talked it over and decided it would be best to keep this
config false by default. We can retain the test coverage but leave the existing
OSS Spark behavior as-is. Thanks for discussing the details of this.
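
For reference, a minimal sketch of what this flag changes when flipped on (the table `t` and its schema are hypothetical; this also assumes the DEFAULT-column feature itself, `spark.sql.defaultColumn.enabled`, is on):

```scala
// Hypothetical session-level illustration; not part of this PR's tests.
// With DEFAULT columns enabled, this flag lets columns that lack an
// explicit DEFAULT behave as if they had declared DEFAULT NULL.
spark.conf.set("spark.sql.defaultColumn.useNullsForMissingDefaultValues", "true")

spark.sql("CREATE TABLE t (a INT, b STRING, c DOUBLE) USING parquet")

// Only a prefix of t's columns receives a value; with the flag on,
// b and c are filled with NULL instead of the statement failing.
spark.sql("INSERT INTO t VALUES (42)")
spark.sql("SELECT * FROM t").show()  // expected row: 42, null, null
```

With the default of `false` (the behavior this thread settles on), the short INSERT above keeps the existing OSS Spark behavior rather than NULL-padding the remaining columns.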