Github user mn-mikke commented on a diff in the pull request:

    https://github.com/apache/spark/pull/22017#discussion_r209876913
  
    --- Diff: sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/higherOrderFunctions.scala ---
    @@ -496,3 +496,194 @@ case class ArrayAggregate(
     
       override def prettyName: String = "aggregate"
     }
    +
    +/**
    + * Merges two given maps into a single map by applying function to the pair of values with
    + * the same key.
    + */
    +@ExpressionDescription(
    +  usage =
    +    """
    +      _FUNC_(map1, map2, function) - Merges two given maps into a single map by applying
    +      function to the pair of values with the same key. For keys only presented in one map,
    +      NULL will be passed as the value for the missing key. If an input map contains duplicated
    +      keys, only the first entry of the duplicated key is passed into the lambda function.
    +    """,
    +  examples = """
    +    Examples:
    +      > SELECT _FUNC_(map(1, 'a', 2, 'b'), map(1, 'x', 2, 'y'), (k, v1, v2) -> concat(v1, v2));
    +       {1:"ax",2:"by"}
    +  """,
    +  since = "2.4.0")
    +case class MapZipWith(left: Expression, right: Expression, function: Expression)
    +  extends HigherOrderFunction with CodegenFallback {
    +
    +  def functionForEval: Expression = functionsForEval.head
    +
    +  @transient lazy val MapType(leftKeyType, leftValueType, leftValueContainsNull) = left.dataType
    +
    +  @transient lazy val MapType(rightKeyType, rightValueType, rightValueContainsNull) = right.dataType
    +
    +  @transient lazy val keyType =
    +    TypeCoercion.findCommonTypeDifferentOnlyInNullFlags(leftKeyType, rightKeyType).get
    --- End diff --
    
    If ```leftKeyType``` is ```ArrayType(IntegerType, false)``` and ```rightKeyType``` is ```ArrayType(IntegerType, true)```, for instance, the coercion rule is not executed, since ```leftKeyType.sameType(rightKeyType) == true```.
    
    An array with nulls seems to be a valid key:
    ```
    scala> spark.range(1).selectExpr("map(array(1, 2, null), 12)").show()
    +---------------------------------------+
    |map(array(1, 2, CAST(NULL AS INT)), 12)|
    +---------------------------------------+
    |                        [[1, 2,] -> 12]|
    +---------------------------------------+
    ```
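    
    The null-flag merging the coercion rule is expected to perform could be sketched outside Spark with simplified stand-in types (the names `DataType`, `ArrayType`, and `mergeNullFlags` below are illustrative, not Catalyst's actual API): two types that differ only in nullability flags should combine into one whose `containsNull` is true wherever either side's is.
    ```
    // Hypothetical stand-ins for Catalyst's type hierarchy, for illustration only.
    sealed trait DataType
    case object IntegerType extends DataType
    case class ArrayType(elementType: DataType, containsNull: Boolean) extends DataType
    
    // Sketch: merge two types that are equal up to null flags,
    // taking the union (logical OR) of the containsNull flags.
    def mergeNullFlags(l: DataType, r: DataType): Option[DataType] = (l, r) match {
      case (a, b) if a == b => Some(a)
      case (ArrayType(le, ln), ArrayType(re, rn)) =>
        mergeNullFlags(le, re).map(ArrayType(_, ln || rn))
      case _ => None // structurally different types: no common type here
    }
    ```
    Under this sketch, ```ArrayType(IntegerType, false)``` and ```ArrayType(IntegerType, true)``` would merge to ```ArrayType(IntegerType, true)```, which is why short-circuiting on ```sameType``` loses information.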

