MaxGekk opened a new pull request #25865: [SPARK-29187][SQL] Return null from `date_part()` for the null `field`
URL: https://github.com/apache/spark/pull/25865
 
 
   ### What changes were proposed in this pull request?
   
   In the PR, I propose to change the behavior of the `date_part()` function when it handles a `null` `field`, and to make it consistent with PostgreSQL: if the `field` parameter is `null`, the function returns `null` of the `double` type, as PostgreSQL does:
   ```sql
   # select pg_typeof(date_part(null, date '2019-09-20'));
       pg_typeof     
   ------------------
    double precision
   (1 row)
   ```
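   
   The idea of the fix can be sketched as follows (a hypothetical illustration, not the exact change in this PR): wherever `date_part()` resolves its `field` argument, a foldable `field` that evaluates to `null` is replaced by a `null` literal of `DoubleType` instead of failing the query. The helper name `replaceNullField` below is made up for illustration only.
   ```scala
   import org.apache.spark.sql.catalyst.expressions.{Expression, Literal}
   import org.apache.spark.sql.types.DoubleType

   // Hypothetical helper: if the `field` argument is a foldable expression that
   // evaluates to null, replace the whole date_part() call with a typed null
   // literal; otherwise keep the already-built extraction expression.
   def replaceNullField(field: Expression, extracted: Expression): Expression =
     if (field.foldable && field.eval() == null) Literal(null, DoubleType)
     else extracted
   ```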
   
   ### Why are the changes needed?
   The `date_part()` function was added to maintain feature parity with PostgreSQL, but its current behavior differs from PostgreSQL's when `null` is passed as the `field` argument.
   
   ### Does this PR introduce any user-facing change?
   Yes.
   
   Before:
   ```sql
   spark-sql> select date_part(null, date'2019-09-20');
   Error in query: null; line 1 pos 7
   ```
   
   After:
   ```sql
   spark-sql> select date_part(null, date'2019-09-20');
   NULL
   ```
   
   ### How was this patch tested?
   Added new tests to `DateFunctionsSuite` for two cases (sketched below the list):
   - `field` = `null`, `source` = a date literal
   - `field` = `null`, `source` = a date column 
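   
   A minimal sketch of what such tests could look like inside `DateFunctionsSuite` (which extends `QueryTest`, so `sql`, `checkAnswer`, and the test implicits are available); the exact test names and shape in the PR may differ:
   ```scala
   import java.sql.Date
   import org.apache.spark.sql.Row

   test("date_part(null, ...) returns NULL") {
     // Case 1: field = null, source = a date literal
     checkAnswer(
       sql("SELECT date_part(NULL, DATE '2019-09-20')"),
       Row(null))

     // Case 2: field = null, source = a date column
     val df = Seq(Date.valueOf("2019-09-20")).toDF("d") // toDF via testImplicits
     checkAnswer(df.selectExpr("date_part(NULL, d)"), Row(null))
   }
   ```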
