[ 
https://issues.apache.org/jira/browse/SPARK-45078?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

ASF GitHub Bot updated SPARK-45078:
-----------------------------------
    Labels: pull-request-available  (was: )

> The ArrayInsert function should make explicit casting when element type not 
> equals derived component type
> ---------------------------------------------------------------------------------------------------------
>
>                 Key: SPARK-45078
>                 URL: https://issues.apache.org/jira/browse/SPARK-45078
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 3.4.1
>            Reporter: Ran Tao
>            Priority: Major
>              Labels: pull-request-available
>
> Generally speaking, array_insert has the same insert semantics as 
> array_prepend/array_append. However, if we run SQL with an element cast as 
> shown below, array_prepend/array_append return the correct result, but 
> array_insert fails.
> {code:java}
> spark-sql (default)> select array_prepend(array(1), cast(2 as tinyint));
> [2,1]
> Time taken: 0.123 seconds, Fetched 1 row(s) {code}
> {code:java}
> spark-sql (default)> select array_append(array(1), cast(2 as tinyint)); 
> [1,2] 
> Time taken: 0.206 seconds, Fetched 1 row(s)
> {code}
> {code:java}
> spark-sql (default)> select array_insert(array(1), 2, cast(2 as tinyint));
> [DATATYPE_MISMATCH.ARRAY_FUNCTION_DIFF_TYPES] Cannot resolve 
> "array_insert(array(1), 2, CAST(2 AS TINYINT))" due to data type mismatch: 
> Input to `array_insert` should have been "ARRAY" followed by a value with 
> same element type, but it's ["ARRAY<INT>", "TINYINT"].; line 1 pos 7;
> 'Project [unresolvedalias(array_insert(array(1), 2, cast(2 as tinyint)), 
> None)]
> +- OneRowRelation {code}
> The reported error is clear; however, we should probably apply an explicit 
> cast here, because collection types such as array or map allow operands of 
> the same type family to coexist.



--
This message was sent by Atlassian Jira
(v8.20.10#820010)
