Hi,
I am new to Spark/Scala development. I have created a job to read data from
a MySQL table using the existing data source API (Spark JDBC). I have
questions regarding data type mapping, as follows.
*Scenario 1:*
I created a table with a FLOAT column in MySQL, but when reading it through
Spark JDBC I get DoubleType.

*Scenario 2:*
I created a table with a SMALLINT column in MySQL, but when reading it
through Spark JDBC I get IntegerType.

These mappings are done in the
*org.apache.spark.sql.execution.datasources.jdbc.JDBCRDD* object.
My questions are:
*1. Why is FLOAT mapped to DoubleType and SMALLINT mapped to IntegerType?
2. Why can we not handle FLOAT and SMALLINT in MySQLDialect, as BINARY is
already handled?*
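As a workaround, I understand a custom JdbcDialect can be registered to
override the default mapping. A sketch of what I have in mind (this assumes
Connector/J reports MySQL FLOAT as java.sql.Types.REAL; I have not
verified this is the recommended approach):

```scala
import java.sql.Types
import org.apache.spark.sql.jdbc.{JdbcDialect, JdbcDialects}
import org.apache.spark.sql.types._

// Hypothetical dialect that keeps FLOAT and SMALLINT narrow
// instead of widening them to DoubleType / IntegerType.
object NarrowMySQLDialect extends JdbcDialect {
  override def canHandle(url: String): Boolean =
    url.startsWith("jdbc:mysql")

  override def getCatalystType(
      sqlType: Int,
      typeName: String,
      size: Int,
      md: MetadataBuilder): Option[DataType] = sqlType match {
    case Types.REAL | Types.FLOAT => Some(FloatType) // MySQL FLOAT
    case Types.SMALLINT           => Some(ShortType) // MySQL SMALLINT
    case _                        => None            // fall back to defaults
  }
}

// Register before creating the DataFrame:
// JdbcDialects.registerDialect(NarrowMySQLDialect)
```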

I am using the following versions:
mysql-connector-java-5.1.34.jar
spark-core_2.11 version: '2.0.2'
spark-sql_2.11 version: '2.0.2'

Thanks
Santlal J. Gupta



--
View this message in context: 
http://apache-spark-user-list.1001560.n3.nabble.com/Spark-JDBC-Data-type-mapping-Float-and-smallInt-Issue-tp28295.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.
