dongjoon-hyun commented on a change in pull request #26492: 
[SPARK-28885][SQL][FOLLOW-UP] Re-enable the ported PgSQL regression tests of 
SQLQueryTestSuite 
URL: https://github.com/apache/spark/pull/26492#discussion_r346095805
 
 

 ##########
 File path: sql/core/src/test/resources/sql-tests/inputs/postgreSQL/numeric.sql
 ##########
 @@ -657,16 +674,18 @@ SELECT AVG(val) FROM num_data;
 
 -- Check for appropriate rounding and overflow
 CREATE TABLE fract_only (id int, val decimal(4,4)) USING parquet;
-INSERT INTO fract_only VALUES (1, '0.0');
-INSERT INTO fract_only VALUES (2, '0.1');
+-- PostgreSQL implicitly casts string literals to decimal values, but
+-- Spark does not support that kind of implicit cast.
+INSERT INTO fract_only VALUES (1, cast('0.0' as decimal(4,4)));
+INSERT INTO fract_only VALUES (2, cast('0.1' as decimal(4,4)));
 -- [SPARK-27923] PostgreSQL throws an exception but Spark SQL is NULL
--- INSERT INTO fract_only VALUES (3, '1.0');   -- should fail
-INSERT INTO fract_only VALUES (4, '-0.9999');
-INSERT INTO fract_only VALUES (5, '0.99994');
+-- INSERT INTO fract_only VALUES (3, '1.0' as decimal(4,4));   -- should fail
+INSERT INTO fract_only VALUES (4, cast('-0.9999' as decimal(4,4)));
+INSERT INTO fract_only VALUES (5, cast('0.99994' as decimal(4,4)));
 
 Review comment:
  Oh, that was just a clarifying comment. I think it's okay to leave it `AS-IS` for now, since we don't support the original query in any way.
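  For anyone following along, here is a minimal sketch (plain Python, not Spark's actual implementation) of the rounding/overflow semantics the `decimal(4,4)` tests above exercise: precision 4 with scale 4 leaves no integer digits, so a value must round to something strictly inside (-1, 1) to fit. The helper name and the HALF_UP rounding mode are assumptions for illustration.

  ```python
  from decimal import Decimal, ROUND_HALF_UP

  def cast_decimal_4_4(literal):
      """Illustrative sketch (not Spark's code): emulate
      CAST(literal AS DECIMAL(4,4)) -- precision 4, scale 4."""
      # Round to 4 fractional digits (HALF_UP is assumed here).
      value = Decimal(literal).quantize(Decimal("0.0001"),
                                        rounding=ROUND_HALF_UP)
      # Precision 4 / scale 4 leaves no integer digits, so |value| < 1.
      if abs(value) >= 1:
          # Overflow: Spark SQL yields NULL (see SPARK-27923),
          # while PostgreSQL raises an error.
          return None
      return value

  print(cast_decimal_4_4("0.99994"))   # 0.9999 (rounds within range)
  print(cast_decimal_4_4("-0.9999"))   # -0.9999 (fits exactly)
  print(cast_decimal_4_4("1.0"))       # None (would overflow)
  ```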

----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
[email protected]


With regards,
Apache Git Services
