Github user ueshin commented on a diff in the pull request:
https://github.com/apache/spark/pull/22017#discussion_r209420042
--- Diff: sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/analysis/TypeCoercion.scala ---
@@ -231,6 +231,15 @@ object TypeCoercion {
      })
  }

+  /**
+   * Similar to [[findTightestCommonType]] but with string promotion.
+   */
+  def findWiderTypeForTwoExceptDecimals(t1: DataType, t2: DataType): Option[DataType] = {
--- End diff ---
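As a side note on the doc comment above: `findTightestCommonType` refuses to unify a string with a non-string atomic type, while a string-promoting variant widens such a pair to `StringType`. A minimal sketch of that difference (the `StringPromotionDemo` name and the expected `Some(StringType)` result are my assumptions from the doc comment, not output from this PR):
```scala
import org.apache.spark.sql.catalyst.analysis.TypeCoercion
import org.apache.spark.sql.types.{IntegerType, StringType}

object StringPromotionDemo extends App {
  // The tightest-common-type search finds nothing for (int, string):
  println(TypeCoercion.findTightestCommonType(IntegerType, StringType)) // None
  // A string-promoting search, per the doc comment, would instead
  // widen the pair to Some(StringType).
}
```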
Ah, I see, good catch! But it led me to another issue: we can't choose a common type for map keys if casting to it could introduce nulls, because map keys must not be null. Instead of adding the method, how about modifying `findTypeForComplex` to something like:
```scala
private def findTypeForComplex(
    t1: DataType,
    t2: DataType,
    findTypeFunc: (DataType, DataType) => Option[DataType]): Option[DataType] = (t1, t2) match {
  ...
  case (MapType(kt1, vt1, valueContainsNull1), MapType(kt2, vt2, valueContainsNull2)) =>
    findTypeFunc(kt1, kt2)
      // Map keys must not be null, so reject any common key type that a cast
      // from either original key type could turn into null.
      .filter(kt => !Cast.forceNullable(kt1, kt) && !Cast.forceNullable(kt2, kt))
      .flatMap { kt =>
        findTypeFunc(vt1, vt2).map { vt =>
          MapType(kt, vt, valueContainsNull1 || valueContainsNull2)
        }
      }
  ...
}
```
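To see why the `filter` matters, here is a minimal sketch of how `Cast.forceNullable` behaves (the helper exists in catalyst's `Cast` object; `ForceNullableDemo` is a hypothetical name, and the printed values reflect my understanding rather than a run against this branch):
```scala
import org.apache.spark.sql.catalyst.expressions.Cast
import org.apache.spark.sql.types.{IntegerType, LongType, StringType}

object ForceNullableDemo extends App {
  // Casting a non-null string key to int can yield null (e.g. "abc"),
  // so such a key type would be rejected by the filter above.
  println(Cast.forceNullable(StringType, IntegerType)) // true
  // Widening int to long can never turn a non-null key into null.
  println(Cast.forceNullable(IntegerType, LongType))   // false
}
```
With that guard, keys only widen to types the casts can reach without introducing nulls, while the value side keeps the usual `valueContainsNull1 || valueContainsNull2` combination.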
We might need another PR to discuss this.
cc @cloud-fan @gatorsmile
---