Github user cloud-fan commented on a diff in the pull request:
https://github.com/apache/spark/pull/18714#discussion_r128921212
--- Diff:
sql/catalyst/src/main/scala/org/apache/spark/sql/internal/SQLConf.scala ---
@@ -881,6 +881,17 @@ object SQLConf {
.intConf
.createWithDefault(10000)
+  val RUNTIME_PARTITION_OVERWRITE =
+    buildConf("spark.sql.runtimePartitionOverwrite")
+      .doc("When overwriting a partitioned table with mixed static and dynamic partition " +
+        "columns, Spark will overwrite partition directories at runtime if this config is " +
+        "enabled, i.e., it will only delete the partition directories that have data written " +
+        "into them during the insertion. By default this config is disabled to keep the " +
+        "previous behavior: Spark deletes all partition directories that match the static " +
+        "partition values provided in the insert statement.")
+      .booleanConf
+      .createWithDefault(false)
--- End diff ---
CC @gatorsmile I decided not to try it, because this config only takes
effect when overwriting a partitioned table with dynamic partition columns, and
enabling it would change the behavior and fail all the related tests.
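To make the difference between the two behaviors concrete, here is a small, self-contained sketch (not Spark code; the table layout, partition values, and the `overwritten_partitions` helper are all hypothetical) that simulates which existing partition directories each mode would delete for an `INSERT OVERWRITE ... PARTITION (country='US', day)` statement, where `country` is static and `day` is dynamic:

```python
# Hypothetical simulation of the two overwrite modes described in the doc string.
# Existing partitions of a table partitioned by (country, day):
existing = {
    ("US", "2017-01-01"),
    ("US", "2017-01-02"),
    ("CN", "2017-01-01"),
}

# INSERT OVERWRITE ... PARTITION (country='US', day): static country, dynamic day.
static_spec = {"country": "US"}
# Partitions the insert query actually writes data into:
written = {("US", "2017-01-02")}

def overwritten_partitions(existing, static_spec, written, runtime_overwrite):
    """Return the set of existing partition dirs that would be deleted."""
    if runtime_overwrite:
        # spark.sql.runtimePartitionOverwrite=true: delete only the
        # partitions that received data during the insertion.
        return existing & written
    # Default (false): delete every partition matching the static values.
    return {p for p in existing if p[0] == static_spec["country"]}

# Default mode deletes both US partitions, even the one the query never touches;
# runtime mode deletes only ("US", "2017-01-02").
print(overwritten_partitions(existing, static_spec, written, False))
print(overwritten_partitions(existing, static_spec, written, True))
```

Under the default, the untouched `("US", "2017-01-01")` partition is dropped as well, which is exactly the behavior change the config gates.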
---