dtenedor commented on code in PR #37430:
URL: https://github.com/apache/spark/pull/37430#discussion_r943829643
##########
sql/catalyst/src/main/scala/org/apache/spark/sql/internal/SQLConf.scala:
##########
@@ -2935,6 +2935,16 @@ object SQLConf {
.booleanConf
.createWithDefault(false)
+  val ADD_DEFAULTS_FOR_INSERTS_WITHOUT_USER_SPECIFIED_FIELDS =
+    buildConf("spark.sql.defaultColumn.addDefaultsForInsertsWithoutUserSpecifiedFields")
+ .internal()
+      .doc("When true, for each INSERT command without any user-specified fields where the " +
Review Comment:
Good questions:
* The `user-specified fields` are the explicit column names listed after the table name in the INSERT statement, e.g. when the command looks like `INSERT INTO mytable(columnA, columnB) VALUES (...)`.
* If `spark.sql.defaultColumn.useNullsForMissingDefaultValues` is true and one of these missing columns does not have an explicit DEFAULT value, then NULL is appended for that column. If that config is false, the command fails with an error message reporting that the number of columns provided is incorrect.
I added this information to the config description here.
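To illustrate the behavior described above, here is a small hedged sketch; the table and column names are hypothetical, not from the PR:

```sql
-- Hypothetical table: column b has an explicit DEFAULT, column c does not.
CREATE TABLE mytable (a INT, b INT DEFAULT 42, c INT);

-- An INSERT without user-specified fields that provides fewer values
-- than the table has columns:
INSERT INTO mytable VALUES (1);
-- With addDefaultsForInsertsWithoutUserSpecifiedFields enabled:
--   * b is filled in with its explicit DEFAULT (42);
--   * c has no explicit DEFAULT, so if useNullsForMissingDefaultValues
--     is true, NULL is appended for c; if it is false, the command fails
--     with an error that the number of columns provided is incorrect.
```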
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
To unsubscribe, e-mail: [email protected]
For queries about this service, please contact Infrastructure at:
[email protected]