shivsood commented on a change in pull request #26301:  [SPARK-29644][SQL] 
Corrected ShortType and ByteType mapping to SmallInt and TinyInt in JDBCUtils 
URL: https://github.com/apache/spark/pull/26301#discussion_r347725332
 
 

 ##########
 File path: 
external/docker-integration-tests/src/test/scala/org/apache/spark/sql/jdbc/MsSqlServerIntegrationSuite.scala
 ##########
 @@ -59,7 +59,7 @@ class MsSqlServerIntegrationSuite extends 
DockerJDBCIntegrationSuite {
       """
         |INSERT INTO numbers VALUES (
         |0,
-        |255, 32767, 2147483647, 9223372036854775807,
+        |127, 32767, 2147483647, 9223372036854775807,
 
 Review comment:
   The behavior would be as follows.
   ### Overwrite scenario
   1. (ShortType -> Int) If a Spark df has a ShortType column, on overwrite an MS SQL Server table with type Int will get created. Because Spark's ShortType range is -32768 to +32767, only these values can be written.
   2. (ByteType -> SmallInt) If a Spark df has a ByteType column, on overwrite an MS SQL Server table with type SmallInt will get created. Because Spark's ByteType range is -128 to +127, only these values can be written.
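   A minimal sketch of the overwrite rule above (a hypothetical helper for illustration, not the actual Spark `JdbcUtils`/dialect code): the table created on overwrite uses a wider DBMS type, but only values inside the Spark type's own range can be written.

   ```java
   // Hypothetical sketch of the overwrite behavior described above; the real
   // mapping lives in Spark's JdbcUtils and the MS SQL Server dialect.
   public class WriteMappingSketch {
       // DBMS column type created when a DataFrame column of the given
       // Spark type overwrites a table, per the scenario list above.
       static String createdDbType(String sparkType) {
           switch (sparkType) {
               case "ByteType":  return "SMALLINT"; // holds Spark's -128..+127
               case "ShortType": return "INT";      // holds Spark's -32768..+32767
               default:          return "UNKNOWN";
           }
       }

       // Only values within the Spark type's own range can be written at all.
       static boolean writable(String sparkType, long value) {
           switch (sparkType) {
               case "ByteType":  return value >= Byte.MIN_VALUE  && value <= Byte.MAX_VALUE;
               case "ShortType": return value >= Short.MIN_VALUE && value <= Short.MAX_VALUE;
               default:          return false;
           }
       }
   }
   ```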
   
   ### Read scenario
   1. If an existing table in MS SQL Server has type TinyInt, a read in Spark results in a ShortType column. Because the ShortType range in Spark is -32768 to +32767, both a signed range of -128 to +127 and the unsigned range of 0 to 255 will be handled.
   
   2. If an existing table in MS SQL Server has type SmallInt, a read results in a Spark dataframe column of type Integer. Because the Integer range in Spark is -2147483648 to +2147483647, both the signed range of -32768 to +32767 and the unsigned range of 0 to 65535 will be handled.
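   The read-path widening above can be sketched with JDBC's standard type codes (a hypothetical stand-in for a dialect's `getCatalystType`, not the actual Spark implementation): each DBMS type maps to the next wider Spark type, so both the signed and the unsigned source ranges fit without overflow.

   ```java
   import java.sql.Types;

   // Hypothetical sketch of the read-path widening described above.
   public class ReadMappingSketch {
       static String catalystTypeFor(int jdbcType) {
           switch (jdbcType) {
               case Types.TINYINT:
                   // 0..255 and -128..+127 both fit in ShortType's -32768..+32767
                   return "ShortType";
               case Types.SMALLINT:
                   // 0..65535 and -32768..+32767 both fit in IntegerType's range
                   return "IntegerType";
               default:
                   return "Unhandled";
           }
       }
   }
   ```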
   
   

----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
[email protected]