GitHub user dilipbiswal opened a pull request:
https://github.com/apache/spark/pull/22544
[SPARK-25522] Improve type promotion for input arguments of elementAt function
## What changes were proposed in this pull request?
In ElementAt, when the first argument is of MapType, we should coerce the map's
key type and the second argument's type based on findTightestCommonType. This
does not happen currently.
Also, when the first argument is of ArrayType, the second argument should be
an integer type or a smaller integral type that can be safely cast to an
integer type. Currently we may perform an unsafe cast. For example:
```SQL
spark-sql> select element_at(array(1,2), 1.24);
1
```
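The index-safety rule described above can be sketched as follows. This is an illustrative toy model, not Spark's actual implementation; the type names are simplified stand-ins for Catalyst's data types, and `safeIndexType` is a hypothetical helper introduced here for clarity:

```scala
// Simplified stand-ins for Catalyst data types (illustrative only).
sealed trait DataType
case object ByteType extends DataType
case object ShortType extends DataType
case object IntegerType extends DataType
case object LongType extends DataType
case object DecimalType extends DataType

// Integral types no wider than Int can be cast to Int without loss of
// information; anything else (Long, Decimal, ...) would be an unsafe cast,
// silently truncating values such as 1.24 down to 1.
def safeIndexType(dt: DataType): Boolean = dt match {
  case ByteType | ShortType | IntegerType => true
  case _ => false
}
```

Under this rule, `element_at(array(1,2), 1.24)` would be rejected at analysis time instead of silently truncating the index.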
```SQL
spark-sql> select element_at(map(1,"one", 2, "two"), 2.2);
two
```
This PR also adds support for implicit casts between two MapTypes, following
logic similar to the existing implicit casts between two ArrayTypes.
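The MapType coercion can be sketched as below. This is a hypothetical, simplified model (not Spark's implementation): `tightest` is a toy tightest-common-type function covering only a couple of cases, and the types are stand-ins. The idea mirrors ArrayType coercion, coercing keys and values component-wise:

```scala
// Simplified stand-ins for Catalyst data types (illustrative only).
sealed trait DataType
case object IntegerType extends DataType
case object LongType extends DataType
case object StringType extends DataType
case class MapType(key: DataType, value: DataType) extends DataType

// Toy tightest-common-type: identical types unify to themselves, and
// Int widens to Long; everything else has no common type here.
def tightest(a: DataType, b: DataType): Option[DataType] = (a, b) match {
  case _ if a == b => Some(a)
  case (IntegerType, LongType) | (LongType, IntegerType) => Some(LongType)
  case _ => None
}

// Coerce two MapTypes by unifying key types and value types separately,
// analogous to unifying element types for two ArrayTypes.
def coerceMaps(m1: MapType, m2: MapType): Option[MapType] =
  for {
    k <- tightest(m1.key, m2.key)
    v <- tightest(m1.value, m2.value)
  } yield MapType(k, v)
```

If either the keys or the values have no common type, the whole coercion fails, so no unsafe cast is introduced.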
## How was this patch tested?
Added new tests in DataFrameFunctionsSuite and TypeCoercionSuite.
You can merge this pull request into a Git repository by running:
$ git pull https://github.com/dilipbiswal/spark SPARK-25522
Alternatively you can review and apply these changes as the patch at:
https://github.com/apache/spark/pull/22544.patch
To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:
This closes #22544
----
commit 4d32f2ce2ff4d5e13c693053041d3111526662cb
Author: Dilip Biswal <dbiswal@...>
Date: 2018-09-16T02:36:18Z
[SPARK-25522] Improve type promotion for input arguments of elementAt function.
----
---
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]