[
https://issues.apache.org/jira/browse/SPARK-6888?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Reynold Xin updated SPARK-6888:
-------------------------------
Assignee: Rene Treffer
> Make DriverQuirks editable
> --------------------------
>
> Key: SPARK-6888
> URL: https://issues.apache.org/jira/browse/SPARK-6888
> Project: Spark
> Issue Type: Improvement
> Components: SQL
> Reporter: Rene Treffer
> Assignee: Rene Treffer
> Priority: Minor
> Fix For: 1.4.0
>
>
> JDBC type conversion is currently handled by Spark with the help of
> DriverQuirks (org.apache.spark.sql.jdbc.DriverQuirks).
> However, some cases can't be resolved, e.g. MySQL "BIGINT UNSIGNED". (Other
> UNSIGNED conversions won't work either, but could be resolved automatically
> by using the next larger type.)
> An invalid type conversion (e.g. loading an unsigned bigint with the highest
> bit set as a long value) causes the JDBC driver to throw an exception.
> The target type is determined automatically and bound to the resulting
> DataFrame, where it is immutable.
> Alternative solutions:
> - Subqueries: these produce extra load on the server
> - SQLContext / jdbc methods with schema support
> - Making it possible to change the schema of DataFrames
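The "next larger type" idea mentioned above can be sketched in plain Java. This is a hypothetical helper, not Spark's actual DriverQuirks API: it maps each unsigned MySQL integer type to the smallest signed JVM type that can hold its full range, with BIGINT UNSIGNED falling back to BigDecimal because no primitive covers 0..2^64-1.

```java
import java.math.BigDecimal;

public class UnsignedPromotion {
    // Hypothetical mapping from an unsigned MySQL integer type to the
    // smallest signed JVM type that can represent its full value range.
    static String promote(String mysqlType) {
        switch (mysqlType) {
            case "TINYINT UNSIGNED":  return "short";      // 0..255 fits in short
            case "SMALLINT UNSIGNED": return "int";        // 0..65535 fits in int
            case "INT UNSIGNED":      return "long";       // 0..2^32-1 fits in long
            case "BIGINT UNSIGNED":   return "BigDecimal"; // 0..2^64-1 exceeds long
            default:                  return "unknown";
        }
    }

    public static void main(String[] args) {
        // A BIGINT UNSIGNED value with the highest bit set cannot be
        // stored in a long, which is why the JDBC driver throws:
        BigDecimal maxUnsigned = new BigDecimal("18446744073709551615"); // 2^64 - 1
        System.out.println(maxUnsigned.compareTo(BigDecimal.valueOf(Long.MAX_VALUE)) > 0); // true
        System.out.println(promote("BIGINT UNSIGNED"));
    }
}
```

An editable DriverQuirks (or a user-supplied schema) would let users apply exactly this kind of promotion instead of being stuck with the automatically inferred, immutable DataFrame schema.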
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)