karim-ramadan opened a new pull request, #48905:
URL: https://github.com/apache/spark/pull/48905
### What changes were proposed in this pull request?
In this pull request, I propose adding a `LocalDateTime` case to the `Row.jsonValue` serializer in order to enable JSON serialization of _TimestampNTZType_ columns:

```scala
case (d: LocalDateTime, _) => JString(timestampFormatter.format(d))
```
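To illustrate what this case produces, here is a small standalone sketch of the same idea outside of Spark: format the `LocalDateTime` as-is (no time-zone conversion, which is the point of TIMESTAMP_NTZ) and wrap the result in a json4s `JString`. The `DateTimeFormatter` pattern below is only a stand-in for Spark's internal `timestampFormatter`, not the exact format Spark emits.

```scala
import java.time.LocalDateTime
import java.time.format.DateTimeFormatter

import org.json4s.JsonAST.JString
import org.json4s.jackson.JsonMethods.{compact, render}

// Stand-in for Spark's internal timestampFormatter: render the naive
// timestamp without applying any time zone.
val ntzFormatter = DateTimeFormatter.ofPattern("yyyy-MM-dd HH:mm:ss")

val d = LocalDateTime.of(2024, 8, 10, 12, 33)
val jValue = JString(ntzFormatter.format(d)) // what the proposed case builds
println(compact(render(jValue)))             // prints "2024-08-10 12:33:00"
```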
### Why are the changes needed?
Currently, trying to serialize a `Row` containing a _TimestampNTZType_ column results in an error:
```
[FAILED_ROW_TO_JSON] Failed to convert the row value '2018-05-14T12:13' of the class class java.time.LocalDateTime to the target SQL type "TIMESTAMPNTZTYPE" in the JSON format. SQLSTATE: 2203G
org.apache.spark.SparkIllegalArgumentException: [FAILED_ROW_TO_JSON] Failed to convert the row value '2018-05-14T12:13' of the class class java.time.LocalDateTime to the target SQL type "TIMESTAMPNTZTYPE" in the JSON format. SQLSTATE: 2203G
  at org.apache.spark.sql.Row.toJson$1(Row.scala:663)
  at org.apache.spark.sql.Row.toJson$1(Row.scala:651)
  at org.apache.spark.sql.Row.jsonValue(Row.scala:665)
  at org.apache.spark.sql.Row.jsonValue$(Row.scala:598)
  at org.apache.spark.sql.catalyst.expressions.GenericRow.jsonValue(rows.scala:28)
  at org.apache.spark.sql.RowJsonSuite.$anonfun$testJson$1(RowJsonSuite.scala:41)
```
How to reproduce the issue:
```scala
import org.apache.spark.sql.Row
import java.time.LocalDateTime

val r = Row.fromSeq(LocalDateTime.of(2024, 8, 10, 12, 33) :: Nil)
r.json
r.prettyJson
```
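The error above names the target SQL type, so the failing path is also hit by rows that carry an explicit schema. Here is a sketch of the reproduction in that form; `GenericRowWithSchema`, the field name `ts`, and the comment about the expected outcome are illustrative assumptions rather than part of the PR:

```scala
import java.time.LocalDateTime

import org.apache.spark.sql.catalyst.expressions.GenericRowWithSchema
import org.apache.spark.sql.types.{StructField, StructType, TimestampNTZType}

// Row with an explicit TIMESTAMP_NTZ field, so jsonValue knows the target type.
val schema = StructType(StructField("ts", TimestampNTZType) :: Nil)
val row = new GenericRowWithSchema(
  Array[Any](LocalDateTime.of(2024, 8, 10, 12, 33)), schema)

// Without the new case these calls raise FAILED_ROW_TO_JSON; with it they
// should render the value as a JSON string.
row.json
row.prettyJson
```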
### Does this PR introduce _any_ user-facing change?
No
### How was this patch tested?
Tests were added to the existing `RowJsonSuite`.
### Was this patch authored or co-authored using generative AI tooling?
No