[
https://issues.apache.org/jira/browse/PHOENIX-3504?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15685124#comment-15685124
]
Sergey Soldatov commented on PHOENIX-3504:
------------------------------------------
One more observation on the Decimal data type. According to ANSI SQL, DECIMAL is
supposed to be an exact numeric (fixed-point) type. Should we add default values
for precision and scale for DECIMAL in this case? At the moment, without
specifying precision and scale, we get an approximate numeric, which contradicts
the SQL standard.
[~jamestaylor] any thoughts?
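If we do add defaults, the Spark type mapping could fall back to an explicit fixed-point type instead of an approximate numeric. Here is a minimal sketch of that idea, assuming Spark's DecimalType.SYSTEM_DEFAULT (precision 38, scale 18) as the fallback; the helper name and the chosen defaults are illustrative, not Phoenix's actual code:
{code:scala}
import org.apache.spark.sql.types.{DataType, DecimalType}

object DecimalMapping {
  // Illustrative sketch: map a Phoenix DECIMAL to a Catalyst type, falling
  // back to explicit fixed-point defaults when precision/scale were omitted,
  // so the result stays an exact numeric as ANSI SQL requires.
  def decimalToCatalyst(precision: Option[Int], scale: Option[Int]): DataType =
    (precision, scale) match {
      // Both specified: keep exactly what the user declared.
      case (Some(p), Some(s)) if p > 0 => DecimalType(p, s)
      // Precision specified, scale omitted: scale defaults to 0 in SQL.
      case (Some(p), None) if p > 0    => DecimalType(p, 0)
      // Neither specified: use Spark's system default, DecimalType(38, 18).
      case _                           => DecimalType.SYSTEM_DEFAULT
    }
}
{code}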
> Spark integration doesn't work with decimal columns that are using default
> precision
> ------------------------------------------------------------------------------------
>
> Key: PHOENIX-3504
> URL: https://issues.apache.org/jira/browse/PHOENIX-3504
> Project: Phoenix
> Issue Type: Bug
> Affects Versions: 4.8.0
> Reporter: Sergey Soldatov
> Assignee: Sergey Soldatov
>
> Not sure when this issue was introduced or whether this code ever worked
> correctly before, but in PhoenixRDD.phoenixTypeToCatalystType we check the
> decimal precision with
> (columnInfo.getPrecision < 0)
> which fails for decimal columns created with the default precision and
> scale, because precision is null in that case.