Paul Zaczkieiwcz created SPARK-20341:
----------------------------------------
Summary: Support BigInteger values > 19 precision
Key: SPARK-20341
URL: https://issues.apache.org/jira/browse/SPARK-20341
Project: Spark
Issue Type: Bug
Components: Spark Core
Affects Versions: 2.1.0, 2.0.2
Reporter: Paul Zaczkieiwcz
If you create a {{Dataset\[scala.math.BigInt\]}}, then values with more than
19 digits of precision cannot be encoded.
{code}
scala> case class BigIntWrapper(value:scala.math.BigInt)
defined class BigIntWrapper
scala> val longDf = spark.createDataset(BigIntWrapper(scala.math.BigInt("10000000000000000002"))::Nil)
17/04/14 19:45:15 main INFO CodeGenerator: Code generated in 211.949738 ms
java.lang.RuntimeException: Error while encoding: java.lang.IllegalArgumentException: requirement failed: BigInteger 10000000000000000002 too large for decimal
staticinvoke(class org.apache.spark.sql.types.Decimal$, DecimalType(38,0), apply, assertnotnull(input[0, BigIntWrapper, true], top level non-flat input object).value, true) AS value#16
+- staticinvoke(class org.apache.spark.sql.types.Decimal$, DecimalType(38,0), apply, assertnotnull(input[0, BigIntWrapper, true], top level non-flat input object).value, true)
   +- assertnotnull(input[0, BigIntWrapper, true], top level non-flat input object).value
      +- assertnotnull(input[0, BigIntWrapper, true], top level non-flat input object)
         +- input[0, BigIntWrapper, true]
  at org.apache.spark.sql.catalyst.encoders.ExpressionEncoder.toRow(ExpressionEncoder.scala:280)
  at org.apache.spark.sql.SparkSession$$anonfun$3.apply(SparkSession.scala:421)
  at org.apache.spark.sql.SparkSession$$anonfun$3.apply(SparkSession.scala:421)
  at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:234)
  at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:234)
  at scala.collection.immutable.List.foreach(List.scala:381)
  at scala.collection.TraversableLike$class.map(TraversableLike.scala:234)
  at scala.collection.immutable.List.map(List.scala:285)
  at org.apache.spark.sql.SparkSession.createDataset(SparkSession.scala:421)
  ... 54 elided
Caused by: java.lang.IllegalArgumentException: requirement failed: BigInteger 10000000000000000002 too large for decimal
  at scala.Predef$.require(Predef.scala:224)
  at org.apache.spark.sql.types.Decimal.set(Decimal.scala:137)
  at org.apache.spark.sql.types.Decimal$.apply(Decimal.scala:419)
  at org.apache.spark.sql.types.Decimal.apply(Decimal.scala)
  at org.apache.spark.sql.catalyst.expressions.GeneratedClass$SpecificUnsafeProjection.apply(Unknown Source)
  at org.apache.spark.sql.catalyst.encoders.ExpressionEncoder.toRow(ExpressionEncoder.scala:277)
  ... 62 more
{code}
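The trace points at {{Decimal.set}}: the {{require}} fires even though the target type is {{DecimalType(38,0)}}, which suggests the unscaled value is being kept on a Long-backed path. A minimal plain-Scala sketch of the limit, using the 20-digit value from the repro above (no Spark required; the attribution to a Long-backed field is my reading of the trace, not confirmed):

```scala
// Illustration only: Long.MaxValue has 19 decimal digits, so a 20-digit
// BigInteger can never fit in a Long, regardless of the declared precision.
val big = BigInt("10000000000000000002") // the value from the repro
val longMax = BigInt(Long.MaxValue)      // 9223372036854775807

println(longMax.toString.length) // 19: digits available on a Long-backed path
println(big > longMax)           // true: triggers "too large for decimal"
```

Since {{DecimalType(38,0)}} can represent up to 38 digits, a fix would presumably route oversized {{BigInteger}} values through the {{BigDecimal}}-backed representation of {{Decimal}} instead of failing the {{require}}.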
--
This message was sent by Atlassian JIRA
(v6.3.15#6346)