HeartSaVioR commented on a change in pull request #31638:
URL: https://github.com/apache/spark/pull/31638#discussion_r611302985
##########
File path:
sql/catalyst/src/main/scala/org/apache/spark/sql/internal/SQLConf.scala
##########
@@ -1681,6 +1681,15 @@ object SQLConf {
.timeConf(TimeUnit.MILLISECONDS)
.createWithDefault(TimeUnit.MINUTES.toMillis(10)) // 10 minutes
+ val FILE_SINK_FORMAT_CHECK_ENABLED =
Review comment:
I'm not in favor of having flags and would like to avoid adding a flag at
all. When we add a flag, we are adding another "branch" to maintain. The purpose
of this PR is to "fix" a regression introduced in Spark 3.0.0 - do we really want
to retain the current behavior even if we define it as a "regression"? If so,
can we make it clear that both behaviors have valid use cases?