planga82 commented on a change in pull request #33525:
URL: https://github.com/apache/spark/pull/33525#discussion_r677822350
##########
File path: sql/core/src/test/scala/org/apache/spark/sql/JsonFunctionsSuite.scala
##########
@@ -390,11 +390,15 @@ class JsonFunctionsSuite extends QueryTest with SharedSparkSession {
   test("SPARK-24027: from_json of a map with unsupported key type") {
     val schema = MapType(StructType(StructField("f", IntegerType) :: Nil), StringType)
-
-    checkAnswer(Seq("""{{"f": 1}: "a"}""").toDS().select(from_json($"value", schema)),
-      Row(null))
Review comment:
@cloud-fan
The difference is the type of the map key:
```
scala> val schema = MapType(StructType(StructField("f", IntegerType) :: Nil), StringType)
scala> Seq("""{{"f": 1}: "a"}""").toDS().select(from_json($"value", schema)).show
+-------+
|entries|
+-------+
|   null|
+-------+

scala> val schema2 = MapType(TimestampType, StringType)
scala> Seq("""{"2021-05-05T20:05:08": "a"}""").toDS().select(from_json($"value", schema2)).show
java.lang.ClassCastException: org.apache.spark.unsafe.types.UTF8String cannot be cast to java.lang.Long
  at scala.runtime.BoxesRunTime.unboxToLong(BoxesRunTime.java:107)
  at org.apache.spark.sql.catalyst.expressions.CastBase.$anonfun$castToString$8$adapted(Cast.scala:297)
  at org.apache.spark.sql.catalyst.expressions.CastBase.buildCast(Cast.scala:285)
  ...
```
I think that with a StructType key the exception is thrown at a different point and handled differently, which is why that query returns null instead of failing. Now the behavior is homogeneous, so an exception is thrown in both cases.
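
For context, here is a minimal sketch (my own illustration, not necessarily the assertion this PR adds) of how the updated test could check that the StructType-key case now fails instead of returning null. It assumes it runs inside `JsonFunctionsSuite`, where `spark.implicits._`, `from_json`, and the type classes are already in scope; the concrete exception class and message depend on the actual implementation.
```scala
// Sketch only: with the new behavior, from_json on a map whose key type is
// unsupported should fail at runtime rather than produce null, so the test
// can intercept the error. Exception[_] is used here because the exact class
// thrown by the PR is an assumption.
val schema = MapType(StructType(StructField("f", IntegerType) :: Nil), StringType)
val e = intercept[Exception] {
  Seq("""{{"f": 1}: "a"}""").toDS().select(from_json($"value", schema)).collect()
}
// A real test would additionally check the error class / message produced by the change.
```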