xuanyuanking commented on a change in pull request #27478: [SPARK-25829][SQL] Add config `spark.sql.legacy.allowDuplicatedMapKeys` and change the default behavior
URL: https://github.com/apache/spark/pull/27478#discussion_r379802575
 
 

 ##########
 File path: sql/catalyst/src/main/scala/org/apache/spark/sql/internal/SQLConf.scala
 ##########
 @@ -2167,6 +2167,16 @@ object SQLConf {
     .booleanConf
     .createWithDefault(false)
 
+  val LEGACY_ALLOW_DUPLICATED_MAP_KEY =
+    buildConf("spark.sql.legacy.allowDuplicatedMapKeys")
+      .doc("When true, use last wins policy to remove duplicated map keys in built-in functions, " +
+        "this config takes effect in the following built-in functions: CreateMap, MapFromArrays, " +
+        "MapFromEntries, StringToMap, MapConcat and TransformKeys. Otherwise, if this is false, " +
+        "which is the default, Spark will throw an exception while duplicated map keys are " +
 Review comment:
   Thanks, done in b102c36.
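
  For context, the "last wins" policy that the doc string above describes can be sketched in plain Scala. This is a hypothetical illustration of the semantics only, not Spark's internal implementation; the `dedupLastWins` helper name is made up for this sketch:

  ```scala
  // Hypothetical sketch of the "last wins" deduplication policy described
  // in the config doc above. Not Spark's actual implementation.
  object LastWinsSketch {
    // Fold entries into a Map: a later duplicate key overwrites the earlier value.
    def dedupLastWins[K, V](entries: Seq[(K, V)]): Map[K, V] =
      entries.foldLeft(Map.empty[K, V]) { case (acc, (k, v)) => acc + (k -> v) }

    def main(args: Array[String]): Unit = {
      // ("a" -> 1) is shadowed by the later ("a" -> 2)
      val result = dedupLastWins(Seq("a" -> 1, "a" -> 2, "b" -> 3))
      println(result) // Map(a -> 2, b -> 3)
    }
  }
  ```

  With the legacy flag unset (the default after this change), Spark instead raises an error on duplicate keys rather than silently applying this policy.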

----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
[email protected]


With regards,
Apache Git Services
