Arun Jijo created SPARK-30646:
---------------------------------

             Summary: transform_keys throws a "Cannot use null as map 
key" exception even though the map contains no null keys
                 Key: SPARK-30646
                 URL: https://issues.apache.org/jira/browse/SPARK-30646
             Project: Spark
          Issue Type: Bug
          Components: SQL
    Affects Versions: 3.0.0
            Reporter: Arun Jijo


I have started experimenting with Spark 3.0's new SQL functions and along the way 
found an issue with the *transform_keys* function. It raises a "Cannot use null as 
map key" exception even though the map doesn't actually contain any null keys.

Find my Spark code below to reproduce the error.
{code:java}
import org.apache.spark.sql.functions._
import spark.implicits._

val df = Seq(Map("EID_1" -> 10000, "EID_2" -> 25000)).toDF("employees")
// Intent: append "XYX" to every key of the map column.
df.withColumn("employees", transform_keys($"employees", (k, v) => lit(k.+("XYX"))))
  .show()
{code}
Running this produces:

Exception in thread "main" java.lang.RuntimeException: *Cannot use null as map key*.
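
For context, one possible explanation (not stated in this report) is that Column.+ performs 
numeric addition, so k.+("XYX") on a string key evaluates to null, which transform_keys then 
rejects as a map key. Below is a minimal sketch of the apparent intent (renaming each key by 
string concatenation) using concat instead; it assumes a standalone application with a local 
SparkSession named spark, which is not part of the original report.

{code:java}
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions._

// Hypothetical standalone sketch, not taken from the original report.
val spark = SparkSession.builder()
  .master("local[*]")
  .appName("transform-keys-sketch")
  .getOrCreate()
import spark.implicits._

val df = Seq(Map("EID_1" -> 10000, "EID_2" -> 25000)).toDF("employees")

// concat keeps each key a non-null string, so no "Cannot use null as map key" error.
df.withColumn("employees", transform_keys($"employees", (k, v) => concat(k, lit("XYX"))))
  .show(false)
{code}

With concat the keys come out as EID_1XYX and EID_2XYX, whereas k.+("XYX") appears to cast the 
string key for arithmetic and yields null.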



