This is an automated email from the ASF dual-hosted git repository.

gurwls223 pushed a commit to branch branch-3.2
in repository https://gitbox.apache.org/repos/asf/spark.git


The following commit(s) were added to refs/heads/branch-3.2 by this push:
     new 669f6df  [SPARK-37692][DOCS] Fix an example of mixed interval fields 
in the SQL migration guide
669f6df is described below

commit 669f6dffec768ceb545573a24d9bd53183bfa827
Author: Max Gekk <[email protected]>
AuthorDate: Tue Dec 21 19:09:29 2021 +0900

    [SPARK-37692][DOCS] Fix an example of mixed interval fields in the SQL 
migration guide
    
    ### What changes were proposed in this pull request?
    Fix an incorrect example in the SQL migration guide, and align it with the 
description of the behavior change.
    <img width="1022" alt="Screenshot 2021-12-21 at 12 00 53" 
src="https://user-images.githubusercontent.com/1580697/146901704-c635720b-a8ca-4ea0-bc79-0220b83f544f.png">
    
    ### Why are the changes needed?
    The current example is incorrect and can confuse users.
    
    ### Does this PR introduce _any_ user-facing change?
    No.
    
    ### How was this patch tested?
    By running the example:
    ```sql
    spark-sql> select INTERVAL 1 day 1 hour;
    1 01:00:00.000000000
    spark-sql> select INTERVAL 1 month 1 hour;
    Error in query:
    Cannot mix year-month and day-time fields: INTERVAL 1 month 1 hour(line 1, 
pos 7)
    
    == SQL ==
    select INTERVAL 1 month 1 hour
    -------^^^
    
    spark-sql> set spark.sql.legacy.interval.enabled=true;
    spark.sql.legacy.interval.enabled   true
    spark-sql> select INTERVAL 1 month 1 hour;
    1 months 1 hours
    ```
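    
    As a supplementary sketch (not part of the original test run), interval 
literals that stay within a single family remain valid in Spark 3.2 without 
the legacy flag; the exact output formatting shown is an assumption based on 
the day-time output above:
    ```sql
    -- year-month fields only: produces a YearMonthIntervalType value
    spark-sql> select INTERVAL '1-1' YEAR TO MONTH;
    -- day-time fields only: produces a DayTimeIntervalType value,
    -- printed like 1 01:00:00.000000000
    spark-sql> select INTERVAL '1 01:00:00' DAY TO SECOND;
    ```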
    
    Closes #34969 from MaxGekk/fix-migr-guide-for-mixed-intervals.
    
    Authored-by: Max Gekk <[email protected]>
    Signed-off-by: Hyukjin Kwon <[email protected]>
    (cherry picked from commit e2eaffec48e30ba95b7663c04a00970ae1b3941d)
    Signed-off-by: Hyukjin Kwon <[email protected]>
---
 docs/sql-migration-guide.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/docs/sql-migration-guide.md b/docs/sql-migration-guide.md
index 6fcc059..97e3b10 100644
--- a/docs/sql-migration-guide.md
+++ b/docs/sql-migration-guide.md
@@ -85,7 +85,7 @@ license: |
 
   - In Spark 3.2, the unit-to-unit interval literals like `INTERVAL '1-1' YEAR 
TO MONTH` and the unit list interval literals like `INTERVAL '3' DAYS '1' HOUR` 
are converted to ANSI interval types: `YearMonthIntervalType` or 
`DayTimeIntervalType`. In Spark 3.1 and earlier, such interval literals are 
converted to `CalendarIntervalType`. To restore the behavior before Spark 3.2, 
you can set `spark.sql.legacy.interval.enabled` to `true`.
 
-  - In Spark 3.2, the unit list interval literals can not mix year-month 
fields (YEAR and MONTH) and day-time fields (WEEK, DAY, ..., MICROSECOND). For 
example, `INTERVAL 1 day 1 hour` is invalid in Spark 3.2. In Spark 3.1 and 
earlier, there is no such limitation and the literal returns value of 
`CalendarIntervalType`. To restore the behavior before Spark 3.2, you can set 
`spark.sql.legacy.interval.enabled` to `true`.
+  - In Spark 3.2, the unit list interval literals can not mix year-month 
fields (YEAR and MONTH) and day-time fields (WEEK, DAY, ..., MICROSECOND). For 
example, `INTERVAL 1 month 1 hour` is invalid in Spark 3.2. In Spark 3.1 and 
earlier, there is no such limitation and the literal returns value of 
`CalendarIntervalType`. To restore the behavior before Spark 3.2, you can set 
`spark.sql.legacy.interval.enabled` to `true`.
 
   - In Spark 3.2, Spark supports `DayTimeIntervalType` and 
`YearMonthIntervalType` as inputs and outputs of `TRANSFORM` clause in Hive 
`SERDE` mode, the behavior is different between Hive `SERDE` mode and `ROW 
FORMAT DELIMITED` mode when these two types are used as inputs. In Hive `SERDE` 
mode, `DayTimeIntervalType` column is converted to `HiveIntervalDayTime`, its 
string format is `[-]?d h:m:s.n`, but in `ROW FORMAT DELIMITED` mode the format 
is `INTERVAL '[-]?d h:m:s.n' DAY TO TIME`. I [...]
 

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
