yaooqinn commented on a change in pull request #26776: [SPARK-30147][SQL] Trim the string when cast string type to booleans
URL: https://github.com/apache/spark/pull/26776#discussion_r354722228
 
 

 ##########
 File path: docs/sql-migration-guide.md
 ##########
 @@ -222,10 +222,12 @@ license: |
 
   - Since Spark 3.0, when casting interval values to string type, there is no "interval" prefix, e.g. `1 days 2 hours`. In Spark version 2.4 and earlier, the string contains the "interval" prefix like `interval 1 days 2 hours`.
 
-  - Since Spark 3.0, when casting string value to integral types, including tinyint, smallint, int and bigint type, the leading and trailing white spaces(<= ACSII 32) will be trimmed before convert to integral values, e.g. `cast(' 1 ' as int)` results `1`. In Spark version 2.4 and earlier, the result will be `null`.
+  - Since Spark 3.0, when casting string values to integral types, including tinyint, smallint, int and bigint, the leading and trailing whitespace characters (<= ASCII 32) are trimmed before conversion to integral values, e.g. `cast(' 1 ' as int)` results in `1`. In Spark version 2.4 and earlier, the result is `null`.
+
+  - Since Spark 3.0, when casting string values to date, timestamp and interval values, the leading and trailing whitespace characters (<= ASCII 32) are trimmed before casting, e.g. `cast('2019-10-10\t' as date)` results in the date value `2019-10-10`. In Spark version 2.4 and earlier, only the trailing space is removed, so the tab here is not trimmed and the result is `null`.
 
 Review comment:
   sgtm
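   A minimal sketch to sanity-check the documented behavior in spark-sql (the expected results are quoted from the migration notes above, not re-verified here):

   ```sql
   -- Trim-on-cast behavior per the migration note: leading/trailing
   -- whitespace characters (<= ASCII 32) are trimmed before casting.
   SELECT CAST(' 1 ' AS INT);           -- 1 in Spark 3.0; NULL in 2.4 and earlier
   SELECT CAST('2019-10-10\t' AS DATE); -- DATE '2019-10-10' in 3.0; NULL in 2.4,
                                        -- since a tab is not a trailing space
   ```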
