Github user cloud-fan commented on a diff in the pull request:
https://github.com/apache/spark/pull/21074#discussion_r181619518
--- Diff: sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/analysis/TypeCoercion.scala ---
@@ -178,7 +178,13 @@ object TypeCoercion {
   private def findWiderCommonType(types: Seq[DataType]): Option[DataType] = {
     types.foldLeft[Option[DataType]](Some(NullType))((r, c) => r match {
       case Some(d) => findWiderTypeForTwo(d, c)
 -     case None => None
 +     // Currently we find the wider common type by comparing the two types from left to right,
--- End diff ---
The real problem is that `findWiderTypeForTwo` doesn't satisfy the associative
law, i.e. `(a op b) op c` may not be equal to `a op (b op c)`. I think
`StringType` is the only exception here, so it's clearer to do
```
val (stringType, nonStringType) = types.partition(_ == StringType)
(stringType.distinct ++ nonStringType).foldLeft...
```
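For readers following along, here is a minimal, self-contained sketch of what that reordered fold could look like. The `findWiderTypeForTwo` below is a hypothetical, heavily simplified stand-in for the private method of the same name in `TypeCoercion` (the real one covers numeric promotion, decimals, complex types, and more), so this only illustrates the shape of the suggestion, not Spark's actual coercion rules.
```scala
import org.apache.spark.sql.types._

object WiderCommonTypeSketch {
  // Hypothetical, heavily simplified stand-in for TypeCoercion.findWiderTypeForTwo.
  // The real method is private to TypeCoercion and handles far more cases.
  private def findWiderTypeForTwo(t1: DataType, t2: DataType): Option[DataType] =
    (t1, t2) match {
      case (NullType, t) => Some(t)
      case (t, NullType) => Some(t)
      case (a, b) if a == b => Some(a)
      case (StringType, _) | (_, StringType) => Some(StringType)
      case (IntegerType, LongType) | (LongType, IntegerType) => Some(LongType)
      case _ => None
    }

  // The suggested fold: move any StringType occurrences to the front so that
  // string widening is applied first in the left-to-right fold.
  def findWiderCommonType(types: Seq[DataType]): Option[DataType] = {
    val (stringType, nonStringType) = types.partition(_ == StringType)
    (stringType.distinct ++ nonStringType)
      .foldLeft[Option[DataType]](Some(NullType))((r, c) => r match {
        case Some(d) => findWiderTypeForTwo(d, c)
        case None => None
      })
  }
}
```
With this reordering, `Seq(IntegerType, StringType)` and `Seq(StringType, IntegerType)` are folded in the same order, since the string is always considered first.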
---
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]