yaooqinn opened a new pull request #26410: [SPARK-29387][SQL][FOLLOWUP] Fix 
issues of the multiply and divide for intervals
URL: https://github.com/apache/spark/pull/26410
 
 
   ### What changes were proposed in this pull request?
   
   Handle the inconsistency in division by zero between interval literals and columns,
   and fix the null-handling issue as well.
   
   
   ### Why are the changes needed?
   Bug fix.
   ### 1. Handle the inconsistency in division by zero between literals and columns
   ```sql
   -- !query 24
   select
       k,
       v,
       cast(k as interval) / v,
       cast(k as interval) * v
   from VALUES
        ('1 seconds', 1),
        ('2 seconds', 0),
        ('3 seconds', null),
        (null, null),
        (null, 0) t(k, v)
   -- !query 24 schema
   struct<k:string,v:int,divide_interval(CAST(k AS INTERVAL), CAST(v AS 
DOUBLE)):interval,multiply_interval(CAST(k AS INTERVAL), CAST(v AS 
DOUBLE)):interval>
   -- !query 24 output
   1 seconds   1   interval 1 seconds  interval 1 seconds
   2 seconds   0   interval 0 microseconds interval 0 microseconds
   3 seconds   NULL    NULL    NULL
   NULL    0   NULL    NULL
   NULL    NULL    NULL    NULL
   ```
   ```sql
   -- !query 21
   select interval '1 year 2 month' / 0
   -- !query 21 schema
   struct<divide_interval(interval 1 years 2 months, CAST(0 AS 
DOUBLE)):interval>
   -- !query 21 output
   NULL
   ```
   
   In the first case, `interval '2 seconds' / 0` (a column divisor) produces `interval 0 microseconds`;
   in the second case, `interval '1 year 2 month' / 0` (a literal divisor) produces NULL.
   These two should agree.
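   The intended consistent behavior can be modeled with a minimal Python sketch (this is a hypothetical helper, not Spark's actual Scala implementation; it assumes the fix aligns both cases on NULL, as in the literal case shown above):

   ```python
   def divide_interval(micros, num):
       """Model of interval division: `micros` is the interval in microseconds,
       `num` is the divisor; None stands in for SQL NULL."""
       if micros is None or num is None:
           return None  # NULL operand -> NULL result
       if num == 0:
           return None  # division by zero -> NULL, for literals and columns alike
       return round(micros / num)

   print(divide_interval(2_000_000, 0))     # -> None, not "interval 0 microseconds"
   print(divide_interval(2_000_000, None))  # -> None
   print(divide_interval(2_000_000, 2))     # -> 1000000
   ```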
   
   ### 2. Handle null literals in interval multiply and divide
   
   ```sql
   -- !query 20
   select interval '1 year 2 month' / null
   -- !query 20 schema
   struct<>
   -- !query 20 output
   org.apache.spark.sql.AnalysisException
   cannot resolve '(interval 1 years 2 months / NULL)' due to data type 
mismatch: differing types in '(interval 1 years 2 months / NULL)' (interval and 
null).; line 1 pos 7
   
   
   -- !query 22
   select interval '4 months 2 weeks 6 days' * null
   -- !query 22 schema
   struct<>
   -- !query 22 output
   org.apache.spark.sql.AnalysisException
   cannot resolve '(interval 4 months 20 days * NULL)' due to data type 
mismatch: differing types in '(interval 4 months 20 days * NULL)' (interval and 
null).; line 1 pos 7
   
   
   -- !query 23
   select null * interval '4 months 2 weeks 6 days'
   -- !query 23 schema
   struct<>
   -- !query 23 output
   org.apache.spark.sql.AnalysisException
   cannot resolve '(NULL * interval 4 months 20 days)' due to data type 
mismatch: differing types in '(NULL * interval 4 months 20 days)' (null and 
interval).; line 1 pos 7
   ```
   Dividing or multiplying by a null literal raises an AnalysisException, whereas a
   null column value works fine, as shown in the first case.
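   The expected null propagation for multiply can be sketched the same way (again a hypothetical Python model, not Spark code: a NULL literal operand should behave like a NULL column value and yield NULL instead of failing analysis):

   ```python
   def multiply_interval(micros, num):
       """Model of interval multiplication with SQL-style null propagation;
       None stands in for SQL NULL."""
       if micros is None or num is None:
           return None  # NULL propagates, mirroring the column case above
       return round(micros * num)

   print(multiply_interval(None, 3))           # models: null * interval -> NULL
   print(multiply_interval(36_000_000, None))  # models: interval * null -> NULL
   print(multiply_interval(2_000_000, 1.5))    # -> 3000000
   ```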
   ### Does this PR introduce any user-facing change?
   <!--
   If yes, please clarify the previous behavior and the change this PR proposes 
- provide the console output, description and/or an example to show the 
behavior difference if possible.
   If no, write 'No'.
   -->
   Possibly yes, but the affected behavior was only introduced by SPARK-29387, so this is just a follow-up fix.
   
   ### How was this patch tested?
   <!--
   If tests were added, say they were added here. Please make sure to add some 
test cases that check the changes thoroughly including negative and positive 
cases if possible.
   If it was tested in a way different from regular unit tests, please clarify 
how you tested step by step, ideally copy and paste-able, so that other 
reviewers can test and check, and descendants can verify in the future.
   If tests were not added, please describe why they were not added and/or why 
it was difficult to add.
   -->
   Added unit tests.
   
   cc @cloud-fan @MaxGekk @maropu 
