[ https://issues.apache.org/jira/browse/SPARK-31695?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17106076#comment-17106076 ]

Hyukjin Kwon commented on SPARK-31695:
--------------------------------------

You can explicitly set the scale and precision:

{code}
import org.apache.spark.sql.functions.udf
import org.apache.spark.sql.types.{DecimalType, MapType, StringType}
import org.json4s.jackson.JsonMethods.parse // assuming json4s, which provides the parse() used here

val df1 = Seq("{\"k\":10.004}").toDF("json")
// The UDF returns a Map, so declare the full MapType; otherwise Spark
// falls back to the default decimal(38,18) for BigDecimal values.
val udfJsonStrToMapDecimal = udf((jsonStr: String) => {
  val jsonMap = parse(jsonStr).values.asInstanceOf[Map[String, Any]]
  jsonMap.map { case (k, v) => (k, BigDecimal.decimal(v.asInstanceOf[Double]).setScale(6)) }
}, MapType(StringType, DecimalType(38, 6)))
val f = df1.withColumn("map", udfJsonStrToMapDecimal($"json"))
f.printSchema
{code}
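
With the MapType declared explicitly, the declared scale should carry through to the schema; the expected output (a sketch, not verified against 2.3.4) is:

{code}
root
 |-- json: string (nullable = true)
 |-- map: map (nullable = true)
 |    |-- key: string
 |    |-- value: decimal(38,6) (valueContainsNull = true)
{code}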

Spark is unable to automatically detect a scale that is only set at runtime inside the UDF body, so the return type has to be declared explicitly.
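
For context, the decimal(38,18) in the reported schema is just Spark's system default for BigDecimal when no return DataType is given; a minimal check (DecimalType.SYSTEM_DEFAULT is part of Spark's public types API):

{code}
import org.apache.spark.sql.types.DecimalType

// Default precision/scale Spark assigns to a Scala BigDecimal
// when the UDF's return DataType is not declared:
println(DecimalType.SYSTEM_DEFAULT)  // decimal(38,18)
{code}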

> BigDecimal setScale is not working in Spark UDF
> -----------------------------------------------
>
>                 Key: SPARK-31695
>                 URL: https://issues.apache.org/jira/browse/SPARK-31695
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core, SQL
>    Affects Versions: 2.3.4
>            Reporter: Saravanan Raju
>            Priority: Major
>
> I was trying to convert a JSON column to a map, using a UDF for the
> conversion, but it is not working as expected.
>   
> {code:java}
> val df1 = Seq(("{\"k\":10.004}")).toDF("json")
> def udfJsonStrToMapDecimal = udf((jsonStr: String) => {
>   var jsonMap: Map[String, Any] = parse(jsonStr).values.asInstanceOf[Map[String, Any]]
>   jsonMap.map { case (k, v) => (k, BigDecimal.decimal(v.asInstanceOf[Double]).setScale(6)) }.toMap
> })
> val f = df1.withColumn("map", udfJsonStrToMapDecimal($"json"))
> scala> f.printSchema
> root
>  |-- json: string (nullable = true)
>  |-- map: map (nullable = true)
>  |    |-- key: string
>  |    |-- value: decimal(38,18) (valueContainsNull = true)
> {code}
>  
> *Instead of decimal(38,6), it infers the value type as decimal(38,18).*


