Github user maropu commented on the issue:
https://github.com/apache/spark/pull/20858
The current code can't handle nested arrays:
```
scala> sql("select concat_arrays(array(1, 2, array(3, 4)), array(5, 6, 7, 8))").show
org.apache.spark.sql.AnalysisException: cannot resolve 'array(1, 2, array(3, 4))' due to data type mismatch: input to function array should all be the same type, but it's [int, int, array<int>]; line 1 pos 21;
'Project [unresolvedalias('concat_arrays(array(1, 2, array(3, 4)), array(5, 6, 7, 8)), None)]
+- OneRowRelation
```
IMHO, it's better to make this function's behaviour consistent with PostgreSQL's:
https://www.postgresql.org/docs/10/static/functions-array.html
Could you brush up the code to handle this case?
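To illustrate the PostgreSQL semantics linked above: `array_cat` can concatenate two arrays of the same dimensionality, and it can also append an (N-1)-dimensional array to an N-dimensional one as a new inner element (e.g. `array_cat(ARRAY[[1,2],[3,4]], ARRAY[5,6])` yields `{{1,2},{3,4},{5,6}}`). A minimal sketch of that behaviour in plain Scala collections (names and helpers here are hypothetical illustrations, not Spark internals):

```scala
object ArrayCatSketch {
  // Same dimensionality: concatenate the outer arrays element by element.
  def catSameDim(a: Seq[Seq[Int]], b: Seq[Seq[Int]]): Seq[Seq[Int]] =
    a ++ b

  // Mixed dimensionality: append the (N-1)-dimensional array as one
  // new inner element, mirroring PostgreSQL's array_cat behaviour.
  def catMixedDim(a: Seq[Seq[Int]], b: Seq[Int]): Seq[Seq[Int]] =
    a :+ b

  def main(args: Array[String]): Unit = {
    // {{1,2},{3,4}} || {{5,6}}  ->  {{1,2},{3,4},{5,6}}
    println(catSameDim(Seq(Seq(1, 2), Seq(3, 4)), Seq(Seq(5, 6))))
    // {{1,2},{3,4}} || {5,6}    ->  {{1,2},{3,4},{5,6}}
    println(catMixedDim(Seq(Seq(1, 2), Seq(3, 4)), Seq(5, 6)))
  }
}
```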