[ https://issues.apache.org/jira/browse/SPARK-7196?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14522904#comment-14522904 ]

Liang-Chi Hsieh commented on SPARK-7196:
----------------------------------------

[~kgeis] Yes. It looks like the BigDecimal returned from Oracle has a different 
scale than the one in the field definition.
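A minimal Scala sketch of that mismatch (the value and the declared scale are 
hypothetical): java.math.BigDecimal carries the scale of the value it was built 
from, which need not match the column's declared NUMBER(10,2) definition.

    // BigDecimal's scale follows the value, not the column definition.
    val declaredScale = 2                                // column declared as NUMBER(10,2)
    val fromDriver = new java.math.BigDecimal("1234.5")  // as a driver might return it
    println(fromDriver.scale)                            // prints 1, not the declared 2
    // normalizing to the declared scale removes the mismatch
    println(fromDriver.setScale(declaredScale))          // prints 1234.50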

> decimal precision lost when loading DataFrame from JDBC
> -------------------------------------------------------
>
>                 Key: SPARK-7196
>                 URL: https://issues.apache.org/jira/browse/SPARK-7196
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 1.3.1
>            Reporter: Ken Geis
>            Assignee: Liang-Chi Hsieh
>             Fix For: 1.3.2, 1.4.0
>
>
> I have a decimal database field that is defined with precision 10 and scale 2 
> (i.e. DECIMAL(10,2), ########.##). When I load it into Spark via 
> sqlContext.jdbc(..), the type of the corresponding field in the DataFrame is 
> DecimalType with precisionInfo None. Because of that loss of precision 
> information, SPARK-4176 is triggered when I try to .saveAsTable(..).
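
For context, the declared precision and scale are exposed through standard JDBC 
metadata, which is the information needed to build DecimalType(precision, scale) 
instead of an unparameterized DecimalType. A minimal Scala sketch, assuming a 
reachable Oracle instance (the connection URL, table, and column are hypothetical):

    import java.sql.{DriverManager, Types}

    val conn = DriverManager.getConnection("jdbc:oracle:thin:@//dbhost:1521/orcl")
    // an empty result set is enough; only the metadata is needed
    val rs = conn.createStatement()
      .executeQuery("SELECT amount FROM payments WHERE 1 = 0")
    val md = rs.getMetaData
    val sqlType = md.getColumnType(1)
    if (sqlType == Types.NUMERIC || sqlType == Types.DECIMAL) {
      val precision = md.getPrecision(1)  // 10 for a NUMBER(10,2) column
      val scale = md.getScale(1)          // 2
      println(s"would map to DecimalType($precision, $scale)")
    }
    rs.close()
    conn.close()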


