iRakson commented on a change in pull request #26933: [SPARK-30292][SQL] Throw Exception when invalid string is cast to decimal in ANSI mode
URL: https://github.com/apache/spark/pull/26933#discussion_r362015910
##########
File path: sql/core/src/test/resources/sql-tests/inputs/postgreSQL/float8.sql
##########
@@ -45,15 +45,15 @@ SELECT double('infinity');
SELECT double(' -INFINiTY ');
-- [SPARK-27923] Spark SQL turns these bad special inputs into NULL
-- bad special inputs
-SELECT double('N A N');
-SELECT double('NaN x');
-SELECT double(' INFINITY x');
+-- SELECT double('N A N');
+-- SELECT double('NaN x');
+-- SELECT double(' INFINITY x');
Review comment:
After changing the behavior of CAST to follow the ANSI standard, a few test
cases started failing. For these special inputs, CAST previously returned `NULL`,
but now an arithmetic exception is thrown.
Some test cases were written only to check that `NULL` is returned for these
special inputs; the queries above are examples of that.
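To illustrate, here is a minimal sketch of the behavior change as seen in a Spark SQL session; the exact exception type and message are not quoted from the PR, so treat those as an assumption:

```sql
-- Legacy behavior (or with spark.sql.ansi.enabled=false):
-- a malformed numeric string silently becomes NULL.
SELECT double('N A N');   -- returns NULL

-- After this change, with spark.sql.ansi.enabled=true:
-- the same cast fails at runtime instead of returning NULL.
SELECT double('N A N');   -- throws an exception
```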
In my view, there were two ways to handle these failing test cases:
1. Comment them out.
2. Set `spark.sql.ansi.enabled` to `false` for these particular queries.
I followed the first approach; a sketch of the second is shown below.
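For completeness, a hedged sketch of what the second option might look like in the test file; whether the sql-tests harness honors mid-file `SET` statements like this is an assumption on my part:

```sql
-- Temporarily fall back to legacy cast semantics so these special inputs
-- still return NULL even when the suite runs with ANSI mode enabled.
SET spark.sql.ansi.enabled=false;
SELECT double('N A N');
SELECT double('NaN x');
SELECT double(' INFINITY x');
-- Restore ANSI semantics for the remaining queries.
SET spark.sql.ansi.enabled=true;
```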