Github user maropu commented on the issue:

    https://github.com/apache/spark/pull/20858
  
    Should we handle arrays with different (but compatible) element types in this function?
    ```
    scala> sql("select concat_arrays(array(1L, 2L), array(3, 4))").show
    org.apache.spark.sql.AnalysisException: cannot resolve 'concat_arrays(array(1L, 2L), array(3, 4))' due to data type mismatch: input to function concat_arrays should all be the same type, but it's [array<bigint>, array<int>]; line 1 pos 7;
    'Project [unresolvedalias(concat_arrays(array(1, 2), array(3, 4)), None)]
    +- OneRowRelation
    ```
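
    As a point of comparison, an explicit cast makes both inputs `array<bigint>`, so the ask is essentially to apply the equivalent widening implicitly. An untested sketch (the expected result in the comment is my assumption):
    ```
    scala> sql("select concat_arrays(array(1L, 2L), cast(array(3, 4) as array<bigint>))").show
    // both inputs are now array<bigint>, so this should resolve and return [1, 2, 3, 4]
    ```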
    Also, could you add more tests for this case in `SQLQueryTestSuite`? Probably we can add a new test file like `concat_arrays.sql` in `typeCoercion.native`; a rough sketch is below.
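
    For example (just a sketch; the exact queries are up to you, and whether each combination should be supported is part of the question):
    ```
    -- a sketch of concat_arrays.sql for typeCoercion.native (illustrative only)
    SELECT concat_arrays(array(1Y, 2Y), array(3S, 4S));    -- tinyint + smallint
    SELECT concat_arrays(array(1, 2), array(3L, 4L));      -- int + bigint
    SELECT concat_arrays(array(1.0, 2.0), array(3D, 4D));  -- decimal + double
    SELECT concat_arrays(array(1L, 2L), array(3, 4));      -- the case above
    ```
    IIRC the expected output files can then be regenerated by running `SQLQueryTestSuite` with `SPARK_GENERATE_GOLDEN_FILES=1`.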

