[
https://issues.apache.org/jira/browse/SPARK-8342?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14584961#comment-14584961
]
Rene Treffer commented on SPARK-8342:
-------------------------------------
[~rxin] [~viirya] yes, I've tested with the patch applied (both the original
report, which is now fixed, and my original problem, which is not fixed) :-S
I've opened SPARK-8359 for the precision loss.
Sorry for the confusion, this patch still causes a similar problem:
{code}
import org.apache.spark.sql.types.Decimal
val d = Decimal(Long.MaxValue,100,0) * Decimal(Long.MaxValue,100,0)
d.toJavaBigDecimal.unscaledValue.toString
8507059173023461584739690778423250
{code}
But cross-checking with bc says it should be
85070591730234615847396907784232501249 ((2^63 - 1) * (2^63 - 1)),
so 8507059173023461584739690778423250 is truncated (the last four digits are dropped).
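For reference, the expected product can be reproduced with plain BigInt arithmetic as a quick sanity check, independent of Spark's Decimal:

```scala
// Sanity check with arbitrary-precision BigInt, outside of Spark's Decimal:
val max = BigInt(Long.MaxValue)   // 2^63 - 1 = 9223372036854775807
val expected = max * max          // (2^63 - 1)^2
println(expected)                 // prints 85070591730234615847396907784232501249
```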
Calling changePrecision(100,0) after the multiplication results in
85070591730234615847396907784232500000
Anyway, different bug, different ticket; the problem is also present
in this case, it's just hidden behind another bug 0.o
> Decimal Math beyond ~2^112 is broken
> ------------------------------------
>
> Key: SPARK-8342
> URL: https://issues.apache.org/jira/browse/SPARK-8342
> Project: Spark
> Issue Type: Bug
> Components: SQL
> Affects Versions: 1.5.0
> Reporter: Rene Treffer
> Assignee: Liang-Chi Hsieh
> Fix For: 1.5.0
>
>
> Here is a snippet from the spark-shell that should not happen
> {code}
> scala> val d = Decimal(Long.MaxValue,100,0) * Decimal(Long.MaxValue,100,0)
> d: org.apache.spark.sql.types.Decimal = 0
> scala> d.toDebugString
> res3: String = Decimal(expanded,0,1,0})
> {code}
> It looks like precision gets reset by some operations and values are then
> truncated.
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)