[ https://issues.apache.org/jira/browse/SPARK-6888?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14493859#comment-14493859 ]
Rene Treffer commented on SPARK-6888:
-------------------------------------
I've changed the DriverQuirks for my personal use, to try this out.
GitHub:
https://github.com/rtreffer/spark/commit/9ca66d9bc62db3519276cfe5c88d20ccaab69ada
PR: https://github.com/apache/spark/pull/5498
> Make DriverQuirks editable
> --------------------------
>
> Key: SPARK-6888
> URL: https://issues.apache.org/jira/browse/SPARK-6888
> Project: Spark
> Issue Type: Improvement
> Components: SQL
> Reporter: Rene Treffer
> Priority: Minor
>
> JDBC type conversion is currently handled by Spark with the help of
> DriverQuirks (org.apache.spark.sql.jdbc.DriverQuirks).
> However, some cases can't be resolved, e.g. MySQL's "BIGINT UNSIGNED"
> (other UNSIGNED conversions won't work either, but those could be resolved
> automatically by using the next larger type).
> An invalid type conversion (e.g. loading an unsigned bigint with the highest
> bit set as a long value) causes the JDBC driver to throw an exception.
> The target type is determined automatically and bound to the resulting
> DataFrame, where it is immutable.
> Alternative solutions:
> - Subqueries, which produce extra load on the server
> - SQLContext / jdbc methods with schema support
> - Making it possible to change the schema of DataFrames
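The overflow behind the BIGINT UNSIGNED case can be shown without Spark or a JDBC driver. A minimal Python sketch (an illustration only, not Spark's actual code path), using struct to mimic reading the value into a fixed-width 64-bit field: an unsigned bigint with the highest bit set does not fit a signed JVM long, while the next larger target type (e.g. DECIMAL(20,0) on the SQL side) can hold it.

```python
import struct

UNSIGNED_BIGINT_MAX = 2**64 - 1  # largest MySQL BIGINT UNSIGNED value
SIGNED_BIGINT_MAX = 2**63 - 1    # largest JVM long (signed 64-bit)

def fits_signed_64(v):
    """True if v can be stored in a signed 64-bit long."""
    try:
        struct.pack(">q", v)  # ">q" = big-endian signed 64-bit
        return True
    except struct.error:
        return False

def fits_unsigned_64(v):
    """True if v can be stored in an unsigned 64-bit field, i.e. the
    'next larger type' resolution (DECIMAL(20,0) in SQL terms)."""
    try:
        struct.pack(">Q", v)  # ">Q" = big-endian unsigned 64-bit
        return True
    except struct.error:
        return False

# A value with the highest bit set exceeds the signed range, so reading
# it as a long fails -- this mirrors the JDBC driver's exception:
print(fits_signed_64(UNSIGNED_BIGINT_MAX))    # False
print(fits_unsigned_64(UNSIGNED_BIGINT_MAX))  # True: a wider type works
```

This is why the issue suggests resolving other UNSIGNED conversions automatically by widening to the next larger type: every value of the unsigned type fits the wider target, so the conversion can never throw.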
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)