pan3793 commented on code in PR #54517:
URL: https://github.com/apache/spark/pull/54517#discussion_r2876944104


##########
sql/catalyst/src/main/scala/org/apache/spark/sql/internal/SQLConf.scala:
##########
@@ -4754,6 +4743,16 @@ object SQLConf {
       .enumConf(StoreAssignmentPolicy)
       .createWithDefault(StoreAssignmentPolicy.ANSI)
 
+  val FILE_SOURCE_INSERT_ENFORCE_NOT_NULL =
+    buildConf("spark.sql.fileSource.insert.enforceNotNull")

Review Comment:
   ```suggestion
       buildConf("spark.sql.files.insert.enforceNotNull")
   ```
   
   Can we follow the existing config namespace? We already have many 
`spark.sql.files.*` configs that are "effective only when using file-based 
sources":
   
   ```
   spark.sql.files.maxPartitionBytes
   spark.sql.files.openCostInBytes
   spark.sql.files.minPartitionNum
   spark.sql.files.maxPartitionNum
   spark.sql.files.ignoreMissingFiles
   ...
   ```
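   As a sketch, the definition could follow that namespace like this (the doc string, config type, and default below are illustrative assumptions, not taken from the PR):
   
   ```scala
   // Hypothetical sketch only: doc text, type, and default are assumed,
   // not copied from the PR under review.
   val FILE_SOURCE_INSERT_ENFORCE_NOT_NULL =
     buildConf("spark.sql.files.insert.enforceNotNull")
       .doc("When true, enforce NOT NULL constraints when inserting into " +
         "file-based data sources. Effective only when using file-based sources.")
       .booleanConf
       .createWithDefault(true)
   ```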



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]

