Ken Geis created SPARK-7196:
-------------------------------

             Summary: decimal precision lost when loading DataFrame from JDBC
                 Key: SPARK-7196
                 URL: https://issues.apache.org/jira/browse/SPARK-7196
             Project: Spark
          Issue Type: Bug
          Components: SQL
    Affects Versions: 1.3.1
            Reporter: Ken Geis


I have a decimal database field defined as DECIMAL(10,2) (i.e. ########.##, 
10 significant digits with 2 after the decimal point). When I load it into 
Spark via sqlContext.jdbc(..), the type of the corresponding field in the 
DataFrame is DecimalType with precisionInfo None. Because of that loss of 
precision information, SPARK-4176 is triggered when I try to 
.saveAsTable(..).
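For reference, the precision/scale pair that gets dropped here is the same one the JDBC driver reports via ResultSetMetaData.getPrecision/getScale, and it is exactly what java.math.BigDecimal carries for an individual value. A minimal illustration of what DECIMAL(10,2) means (the value below is a made-up sample, not from my database):

```java
import java.math.BigDecimal;

public class DecimalPrecisionDemo {
    public static void main(String[] args) {
        // A value that exactly fills DECIMAL(10,2):
        // 8 integer digits + 2 fraction digits = 10 significant digits.
        BigDecimal d = new BigDecimal("12345678.90");

        System.out.println(d.precision()); // 10 -- total significant digits
        System.out.println(d.scale());     // 2  -- digits after the decimal point
    }
}
```

A DecimalType that preserved this pair (precision 10, scale 2) instead of precisionInfo None would let .saveAsTable(..) emit a concrete DECIMAL(10,2) column rather than hitting SPARK-4176.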



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
