Github user maropu commented on a diff in the pull request:

    https://github.com/apache/spark/pull/21704#discussion_r200025173
  
    --- Diff: sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/collectionOperations.scala ---
    @@ -2007,7 +2007,14 @@ case class Concat(children: Seq[Expression]) extends Expression {
         }
       }
     
    -  override def dataType: DataType = children.map(_.dataType).headOption.getOrElse(StringType)
    +  override def dataType: DataType = {
    +    val dataTypes = children.map(_.dataType)
    +    dataTypes.headOption.map {
    +      case ArrayType(et, _) =>
    +        ArrayType(et, dataTypes.exists(_.asInstanceOf[ArrayType].containsNull))
    +      case dt => dt
    +    }.getOrElse(StringType)
    +  }
    --- End diff --
    
    Aha, I see. But I just have a hunch that `SimplifyCasts` cannot simplify array casts in some cases, e.g., this concat case. Since we basically cannot change semantics in the optimization phase, I feel a little weird about this simplification.
    
    Anyway, I'm ok with your approach because I can't find a better & simpler way to solve this in the analysis phase... Thanks!


---

---------------------------------------------------------------------
To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org
For additional commands, e-mail: reviews-h...@spark.apache.org
