[ https://issues.apache.org/jira/browse/SPARK-7196?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14521067#comment-14521067 ]
Ken Geis commented on SPARK-7196:
---------------------------------
This does not fix my issue.
{noformat}
scala> val amounts = sqlContext.jdbc(coeusURL, "(SELECT total_direct_cost_total FROM osp$proposal WHERE rownum < 2)")
amounts: org.apache.spark.sql.DataFrame = [TOTAL_DIRECT_COST_TOTAL: decimal(10,0)]

scala> amounts.schema(0).dataType.asInstanceOf[org.apache.spark.sql.types.DecimalType].precision
res8: Int = -1

scala> amounts.schema(0).dataType.asInstanceOf[org.apache.spark.sql.types.DecimalType].scale
res9: Int = -1

scala> amounts.schema(0).dataType.asInstanceOf[org.apache.spark.sql.types.DecimalType].precisionInfo
res10: Option[org.apache.spark.sql.types.PrecisionInfo] = None

scala> amounts.saveAsTable("amounts")
...
java.lang.RuntimeException: Unsupported datatype DecimalType()
{noformat}
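A possible workaround until this is fixed (a sketch against the Spark 1.3 DataFrame API; the column and table names are taken from the session above) is to cast the column to a DecimalType with explicit precision and scale before saving:

{noformat}
import org.apache.spark.sql.types.DecimalType

// Rebuild the column with an explicit DecimalType(10, 2) so that
// saveAsTable has a concrete precision/scale to map to a Hive type.
val fixed = amounts.select(
  amounts("TOTAL_DIRECT_COST_TOTAL")
    .cast(DecimalType(10, 2))
    .as("TOTAL_DIRECT_COST_TOTAL"))

fixed.saveAsTable("amounts")
{noformat}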
> decimal precision lost when loading DataFrame from JDBC
> -------------------------------------------------------
>
> Key: SPARK-7196
> URL: https://issues.apache.org/jira/browse/SPARK-7196
> Project: Spark
> Issue Type: Bug
> Components: SQL
> Affects Versions: 1.3.1
> Reporter: Ken Geis
>
> I have a decimal database field that is defined with precision 10 and scale 2 (i.e. ########.##).
> When I load it into Spark via sqlContext.jdbc(..), the type of the
> corresponding field in the DataFrame is DecimalType, with precisionInfo None.
> Because of that loss of precision information, SPARK-4176 is triggered when I
> try to .saveAsTable(..).
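For reference, the precision and scale that Spark drops here are reported by the JDBC driver itself through ResultSetMetaData; a minimal plain-JDBC sketch (reusing coeusURL from the session above, and assuming any credentials are embedded in the URL):

{noformat}
import java.sql.DriverManager

val conn = DriverManager.getConnection(coeusURL)
val rs = conn.createStatement().executeQuery(
  "SELECT total_direct_cost_total FROM osp$proposal WHERE rownum < 2")
val md = rs.getMetaData

// The driver reports the column's declared precision and scale here;
// for a column defined as (10,2) these should print 10 and 2.
println(md.getPrecision(1))
println(md.getScale(1))

conn.close()
{noformat}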