shivsood commented on a change in pull request #26301: [SPARK-29644][SQL]
Corrected ShortType and ByteType mapping to SmallInt and TinyInt in JDBCUtils
URL: https://github.com/apache/spark/pull/26301#discussion_r347672314
##########
File path:
external/docker-integration-tests/src/test/scala/org/apache/spark/sql/jdbc/MsSqlServerIntegrationSuite.scala
##########
@@ -59,7 +59,7 @@ class MsSqlServerIntegrationSuite extends
DockerJDBCIntegrationSuite {
"""
|INSERT INTO numbers VALUES (
|0,
- |255, 32767, 2147483647, 9223372036854775807,
+ |127, 32767, 2147483647, 9223372036854775807,
Review comment:
Thanks all for pointing out these issues. I had overlooked the handling of
unsigned values and the fact that each database defines these ranges on its
own. I think the problem exists for both SMALLINT and TINYINT.
- TINYINT: per [MySQL](https://dev.mysql.com/doc/refman/8.0/en/integer-types.html),
a signed TINYINT ranges from -128 to 127 and an unsigned TINYINT from 0 to 255;
[SQL Server](https://docs.microsoft.com/en-us/sql/t-sql/data-types/int-bigint-smallint-and-tinyint-transact-sql?view=sql-server-ver15)
defines TINYINT as unsigned only, 0 to 255.
- SMALLINT: per [MySQL](https://dev.mysql.com/doc/refman/8.0/en/integer-types.html),
an unsigned SMALLINT can take values 0 to 65535 (see the sketch after this list).
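For illustration only, a minimal sketch in plain Scala (not code from this PR) showing that Spark's ByteType and ShortType are backed by signed JVM types, so the unsigned maxima above do not fit:

```scala
// Minimal sketch: signed JVM ranges behind Spark's ByteType/ShortType vs. unsigned DB maxima.
object SignedRangeCheck {
  def fitsInByte(v: Long): Boolean  = v >= Byte.MinValue && v <= Byte.MaxValue   // -128..127
  def fitsInShort(v: Long): Boolean = v >= Short.MinValue && v <= Short.MaxValue // -32768..32767

  def main(args: Array[String]): Unit = {
    println(fitsInByte(127L))    // true  -> signed TINYINT max
    println(fitsInByte(255L))    // false -> unsigned TINYINT max overflows ByteType
    println(fitsInShort(32767L)) // true  -> signed SMALLINT max
    println(fitsInShort(65535L)) // false -> unsigned SMALLINT max overflows ShortType
  }
}
```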
My understanding is that there are no unsigned types in Spark, cf.
https://spark.apache.org/docs/latest/sql-reference.html. Is that assertion
right?
Do we have a test for an INT column whose value is 4294967295? Per the
[MySQL](https://dev.mysql.com/doc/refman/8.0/en/integer-types.html)
documentation, an unsigned INT can hold that value.
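For illustration only (this is not an existing test), a quick check in plain Scala showing why that value would be a problem for Spark's signed IntegerType and would need LongType instead:

```scala
// 4294967295 is the unsigned INT maximum; it exceeds the signed 32-bit range
// that backs Spark's IntegerType, but it fits comfortably in a Long (LongType).
object UnsignedIntCheck extends App {
  val unsignedIntMax = 4294967295L
  println(unsignedIntMax > Int.MaxValue)   // true: 4294967295 > 2147483647
  println(unsignedIntMax <= Long.MaxValue) // true: it fits in LongType
}
```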