Github user dongjoon-hyun commented on a diff in the pull request:

    https://github.com/apache/spark/pull/20023#discussion_r158154036

```diff
--- Diff: sql/core/src/test/resources/sql-tests/inputs/decimals.sql ---
@@ -0,0 +1,16 @@
+-- tests for decimals handling in operations
+-- Spark draws its inspiration from the Hive implementation
+create table decimals_test(id int, a decimal(38,18), b decimal(38,18)) using parquet;
+
+insert into decimals_test values(1, 100.0, 999.0);
+insert into decimals_test values(2, 12345.123, 12345.123);
+insert into decimals_test values(3, 0.1234567891011, 1234.1);
+insert into decimals_test values(4, 123456789123456789.0, 1.123456789123456789);
```

--- End diff --

nit. How about making it into one SQL statement?

```sql
insert into decimals_test values
  (1, 100.0, 999.0),
  (2, 12345.123, 12345.123),
  (3, 0.1234567891011, 1234.1),
  (4, 123456789123456789.0, 1.123456789123456789)
```
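For context, the multi-row `VALUES` form suggested above inserts the same four rows as the four separate statements. A minimal sketch of the equivalence, using Python's stdlib `sqlite3` as a stand-in for Spark SQL (an assumption for illustration only; SQLite has no `decimal(38,18)` or `using parquet`, so plain `real` columns are used):

```python
import sqlite3

# Hypothetical stand-in environment: in-memory SQLite instead of Spark SQL.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Simplified schema: real replaces decimal(38,18), no parquet clause.
cur.execute("create table decimals_test(id int, a real, b real)")

# One multi-row INSERT, as the review suggests, instead of four statements.
cur.execute("""
    insert into decimals_test values
      (1, 100.0, 999.0),
      (2, 12345.123, 12345.123),
      (3, 0.1234567891011, 1234.1),
      (4, 123456789123456789.0, 1.123456789123456789)
""")

cur.execute("select count(*) from decimals_test")
row_count = cur.fetchone()[0]
print(row_count)  # 4
```

Both forms produce an identical table; the single statement is simply more compact in a test fixture.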