Gengliang Wang created SPARK-44763:
--------------------------------------

             Summary: Fix a bug that promotes string to double in binary 
arithmetic with interval
                 Key: SPARK-44763
                 URL: https://issues.apache.org/jira/browse/SPARK-44763
             Project: Spark
          Issue Type: Bug
          Components: SQL
    Affects Versions: 4.0.0
            Reporter: Gengliang Wang
            Assignee: Gengliang Wang


The following query works on branch-3.5 and earlier, but fails on the latest 
master:

```
select concat(DATE'2020-12-31', ' ', date_format('09:03:08', 'HH:mm:ss')) + 
(INTERVAL '03' HOUR)
```
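
For context, a minimal spark-shell reproduction could look like this (a sketch, assuming the usual `spark` session):
```
// Sketch of a spark-shell reproduction (assumes an active SparkSession `spark`).
// On branch-3.5 the concat result (a string) is coerced to timestamp and the
// interval is added; on the current master the same query fails analysis.
val df = spark.sql(
  """select concat(DATE'2020-12-31', ' ',
    |              date_format('09:03:08', 'HH:mm:ss'))
    |       + (INTERVAL '03' HOUR)""".stripMargin)
df.show(false)
```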
 
The direct reason is that we now mark `cast(date as string)` as resolved during 
type coercion, after the changes in 
https://github.com/apache/spark/pull/42089. As a result, there are two 
transforms from CombinedTypeCoercionRule:
```
Rule ConcatCoercion Transformed concat(2020-12-31,  , date_format(cast(09:03:08 
as timestamp), HH:mm:ss, Some(America/Los_Angeles))) to concat(cast(2020-12-31 
as string),  , date_format(cast(09:03:08 as timestamp), HH:mm:ss, 
Some(America/Los_Angeles)))

Rule PromoteStrings Transformed (concat(cast(2020-12-31 as string),  , 
date_format(cast(09:03:08 as timestamp), HH:mm:ss, Some(America/Los_Angeles))) 
+ INTERVAL '03' HOUR) to (cast(concat(cast(2020-12-31 as string),  , 
date_format(cast(09:03:08 as timestamp), HH:mm:ss, Some(America/Los_Angeles))) 
as double) + INTERVAL '03' HOUR)
```  
The second transform didn't happen in previous releases, since 
`cast(2020-12-31 as string)` used to be unresolved after the first transform.
 
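For reference, transform messages like the ones above can be surfaced via the plan change log. A sketch (assuming a spark-shell session on a build where the query still analyzes, e.g. branch-3.5):
```
// Sketch: log every rule transform during analysis/optimization, then
// inspect the fully analyzed plan. On master the analysis itself fails,
// so run this on branch-3.5 to see the coerced plan.
spark.conf.set("spark.sql.planChangeLog.level", "WARN")
val analyzed = spark.sql(
  "select concat(DATE'2020-12-31', ' ', date_format('09:03:08', 'HH:mm:ss')) + (INTERVAL '03' HOUR)"
).queryExecution.analyzed
println(analyzed.numberedTreeString)
```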
The fix is simple: the analyzer should not promote string to double in binary 
arithmetic with an ANSI interval. The changes in 
https://github.com/apache/spark/pull/42089 are valid and we should keep them.
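
A hypothetical shape of that guard, as a standalone Scala sketch with simplified stand-in types (not the actual patch):
```
// Standalone sketch of the intended guard, with simplified stand-in types
// (hypothetical; not the actual Spark code). The point: string promotion
// to double must be skipped when the other operand is an ANSI interval.
sealed trait DataType
case object StringType extends DataType
case object DoubleType extends DataType
sealed trait AnsiIntervalType extends DataType
case object DayTimeIntervalType extends AnsiIntervalType
case object YearMonthIntervalType extends AnsiIntervalType

def promoteStringsInArithmetic(left: DataType, right: DataType): (DataType, DataType) =
  (left, right) match {
    // Guard: leave string + ANSI interval alone; other coercion rules
    // handle the proper cast (e.g. string -> timestamp) instead.
    case (StringType, _: AnsiIntervalType) | (_: AnsiIntervalType, StringType) =>
      (left, right)
    case (StringType, other) => (DoubleType, other)
    case (other, StringType) => (other, DoubleType)
    case _ => (left, right)
  }

// With the guard, the interval operand keeps the string from being
// promoted to double; plain string arithmetic still promotes.
assert(promoteStringsInArithmetic(StringType, DayTimeIntervalType) ==
  (StringType, DayTimeIntervalType))
assert(promoteStringsInArithmetic(StringType, DoubleType) ==
  (DoubleType, DoubleType))
```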
 


