Github user mn-mikke commented on a diff in the pull request:

    https://github.com/apache/spark/pull/21620#discussion_r197716236
  
    --- Diff: sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/analysis/TypeCoercion.scala ---
    @@ -536,6 +536,11 @@ object TypeCoercion {
               case None => c
             }
     
    +      case ArrayJoin(arr, d, nr) if !ArrayType(StringType).acceptsType(arr.dataType) &&
    +        ArrayType.acceptsType(arr.dataType) =>
    +        val containsNull = arr.dataType.asInstanceOf[ArrayType].containsNull
    +        ArrayJoin(Cast(arr, ArrayType(StringType, containsNull)), d, nr)
    --- End diff --
    
    Hi @mgaido91,
    to be honest, I considered this option before submitting this PR, but I'm glad you mentioned this approach so we can discuss the pros and cons of the different solutions. Using ```ImplicitTypeCasts.implicitCast``` would enable conversion only from primitive types, and I think it would be nice to support non-primitive element types as well (see the sketch below). WDYT?
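    Just to make the difference concrete, here's a minimal spark-shell sketch (assuming a build that includes the coercion rule from this diff; the nested-array case is exactly the non-primitive scenario that ```ImplicitTypeCasts.implicitCast``` would not cover):
    ```scala
    import org.apache.spark.sql.functions.array_join
    import spark.implicits._

    // array<int>: the rule above casts the array to array<string> before array_join runs.
    Seq(Seq(1, 2, 3)).toDF("xs")
      .select(array_join($"xs", ", "))
      .show(false)

    // array<array<int>>: a non-primitive element type. Cast.canCast allows casting the
    // array<int> elements to StringType, so the explicit Cast in the diff can handle it,
    // while an implicitCast-based rule would only cover primitive element types.
    Seq(Seq(Seq(1, 2), Seq(3, 4))).toDF("xs")
      .select(array_join($"xs", " | "))
      .show(false)
    ```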
    
    Re: casting to ```StringType```: according to the ```Cast.canCast``` method, it should be possible to cast any type to ```StringType```:
    **line 42:** ```case (_, StringType) => true```
    Or am I missing something? I hope the test cases in *.../typeCoercion/native/arrayJoin.sql* cover conversions to ```StringType``` from all Spark types.
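    As a quick sanity check of that branch (just an illustrative snippet, not part of the PR), casting non-primitive types straight to string is already accepted, e.g.:
    ```scala
    // Cast.canCast permits casting any type, including complex ones, to StringType,
    // so the following plain-SQL casts are accepted and evaluate:
    spark.sql(
      """SELECT CAST(array(1, 2, 3)       AS STRING) AS arr_as_str,
        |       CAST(named_struct('a', 1) AS STRING) AS struct_as_str,
        |       CAST(map('k', 1)          AS STRING) AS map_as_str
      """.stripMargin).show(false)
    ```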

