Saravanan Raju created SPARK-31695:
--------------------------------------
Summary: BigDecimal setScale is not working in Spark UDF
Key: SPARK-31695
URL: https://issues.apache.org/jira/browse/SPARK-31695
Project: Spark
Issue Type: Bug
Components: Spark Core, SQL
Affects Versions: 2.3.4
Reporter: Saravanan Raju
I was trying to convert a JSON string column to a map column. I wrote a UDF for the
conversion, but it is not working as expected.
val df1 = Seq(("\{\"k\":10.004}")).toDF("json")
def udfJsonStrToMapDecimal = udf((jsonStr: String)=> \{ var
jsonMap:Map[String,Any] = parse(jsonStr).values.asInstanceOf[Map[String, Any]]
jsonMap.map{case(k,v) =>
(k,BigDecimal.decimal(v.asInstanceOf[Double]).setScale(6))}.toMap
})
val f = df1.withColumn("map",udfJsonStrToMapDecimal($"json"))
scala> f.printSchema
root
|-- json: string (nullable = true)
|-- map: map (nullable = true)
| |-- key: string
| |-- value: decimal(38,18) (valueContainsNull = true)
*Instead of decimal(38,6), the value column is declared as decimal(38,18).*
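For context, the observed schema appears consistent with how Spark infers a UDF's return type via Scala reflection: scala.math.BigDecimal maps to the system default DecimalType(38,18), so setScale changes only the runtime value, not the inferred schema. Below is a minimal sketch of a possible workaround that passes the intended DataType explicitly to functions.udf, assuming json4s (bundled with Spark) is used for parsing; names such as mapOfDecimal6, udfJsonStrToMapDecimal6 and f2 are illustrative, not from the original report.

import org.apache.spark.sql.functions.udf
import org.apache.spark.sql.types.{DecimalType, MapType, StringType}
import org.json4s.jackson.JsonMethods.parse

// Declare the value type as decimal(38,6) instead of letting Spark infer decimal(38,18).
val mapOfDecimal6 = MapType(StringType, DecimalType(38, 6))

val udfJsonStrToMapDecimal6 = udf((jsonStr: String) => {
  val jsonMap = parse(jsonStr).values.asInstanceOf[Map[String, Any]]
  jsonMap.map { case (k, v) => (k, BigDecimal.decimal(v.asInstanceOf[Double]).setScale(6)) }
}, mapOfDecimal6)

val f2 = df1.withColumn("map", udfJsonStrToMapDecimal6($"json"))
// f2.printSchema should now report value: decimal(38,6)

Casting the already-computed column, e.g. $"map".cast(MapType(StringType, DecimalType(38, 6))), should also produce the narrower scale, though that performs an extra conversion after the UDF runs.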