[ 
https://issues.apache.org/jira/browse/SPARK-16646?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15388916#comment-15388916
 ] 

Hyukjin Kwon commented on SPARK-16646:
--------------------------------------

I see! Please let me share my thoughts as well, in case they are helpful.

Actually, I thought it should not lose values (fractional or integral parts) but 
should just throw an exception when the precision exceeds 38 while trying to find 
a tight common type (and this is what my PR currently does).

For the JSON data source, schema inference falls back to double type in this 
case.

If such operations should be allowed to change the original values, how about 
falling back to double type here as well?
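The "tight common type" idea above can be sketched roughly as follows. This is a simplified Python illustration, not Spark's actual implementation; the function name `tight_common_decimal` and the way the fallback is signalled are assumptions. It follows the usual decimal-widening rule of keeping the larger integral part and the larger scale, and gives up (so the caller could fall back to double) once the combined precision would exceed 38, Spark SQL's maximum decimal precision:

```python
MAX_PRECISION = 38  # Spark SQL's maximum decimal precision


def tight_common_decimal(p1, s1, p2, s2):
    """Return (precision, scale) of the tightest decimal type that can
    hold both DecimalType(p1, s1) and DecimalType(p2, s2) without losing
    fractional or integral digits, or None when that would need more
    than MAX_PRECISION digits (the case discussed above, where the
    caller would either raise or fall back to double)."""
    scale = max(s1, s2)                    # keep all fractional digits
    integral = max(p1 - s1, p2 - s2)       # keep all integral digits
    precision = integral + scale
    if precision > MAX_PRECISION:
        return None                        # no lossless common decimal
    return (precision, scale)
```

For example, widening DecimalType(10, 0) with DecimalType(2, 1) yields DecimalType(11, 1), while widening DecimalType(38, 0) with DecimalType(2, 1) would need 39 digits and so returns None.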

> LEAST doesn't accept numeric arguments with different data types
> ----------------------------------------------------------------
>
>                 Key: SPARK-16646
>                 URL: https://issues.apache.org/jira/browse/SPARK-16646
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 2.0.0
>            Reporter: Cheng Lian
>            Assignee: Hyukjin Kwon
>
> {code:sql}
> SELECT LEAST(1, 1.5);
> {code}
> {noformat}
> Error: org.apache.spark.sql.AnalysisException: cannot resolve 'least(1, 
> CAST(2.1 AS DECIMAL(2,1)))' due to data type mismatch: The expressions should 
> all have the same type, got LEAST (ArrayBuffer(IntegerType, 
> DecimalType(2,1))).; line 1 pos 7 (state=,code=0)
> {noformat}
> This query works in Spark 1.6.



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
