[
https://issues.apache.org/jira/browse/SPARK-10648?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Yin Huai updated SPARK-10648:
-----------------------------
Fix Version/s: 1.5.3
1.4.2
> Spark-SQL JDBC fails to set a default precision and scale when they are not
> defined in an Oracle schema.
> --------------------------------------------------------------------------------------------------------
>
> Key: SPARK-10648
> URL: https://issues.apache.org/jira/browse/SPARK-10648
> Project: Spark
> Issue Type: Bug
> Components: SQL
> Affects Versions: 1.5.0
> Environment: Oracle 11g, ojdbc7.jar
> Reporter: Travis Hegner
> Assignee: Travis Hegner
> Fix For: 1.4.2, 1.5.3, 1.6.0
>
>
> Using Oracle 11g as a data source with ojdbc7.jar. When importing data into a
> Scala app, I am getting an "Overflowed precision" exception. Sometimes I
> would instead get the exception "Unscaled value too large for precision".
> This issue likely affects older versions as well, but 1.5.0 is the version I
> verified it on.
> I narrowed it down to the schema detection code setting the precision to 0
> and the scale to -127 for NUMBER columns that are declared without an
> explicit precision and scale.
> I have a proposed pull request to follow.
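> A minimal workaround sketch, assuming the bug shows up as a reported
> precision of 0 for unsized NUMBER columns: a custom JdbcDialect can
> substitute a bounded DecimalType before Spark's decimal handling
> overflows. JdbcDialect and JdbcDialects.registerDialect are real Spark
> APIs; the fallback DecimalType(38, 10) is an assumption for illustration,
> not the values the actual patch uses.
>
>   import java.sql.Types
>   import org.apache.spark.sql.jdbc.{JdbcDialect, JdbcDialects}
>   import org.apache.spark.sql.types._
>
>   // Hypothetical dialect: Oracle reports precision 0 and scale -127
>   // for NUMBER columns declared without precision/scale.
>   object OracleNumberDialect extends JdbcDialect {
>     override def canHandle(url: String): Boolean =
>       url.startsWith("jdbc:oracle")
>
>     override def getCatalystType(
>         sqlType: Int, typeName: String, size: Int,
>         md: MetadataBuilder): Option[DataType] = {
>       // "size" carries the reported precision; an unsized NUMBER
>       // arrives here as 0, which later surfaces as "Overflowed
>       // precision" / "Unscaled value too large for precision".
>       if (sqlType == Types.NUMERIC && size == 0) {
>         // Assumed fallback: 38 is Oracle's maximum precision; the
>         // scale of 10 is an arbitrary illustrative choice.
>         Some(DecimalType(38, 10))
>       } else {
>         None // defer to Spark's default type mapping
>       }
>     }
>   }
>
>   // Register the dialect before reading over JDBC:
>   JdbcDialects.registerDialect(OracleNumberDialect)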