uros-db commented on code in PR #54325:
URL: https://github.com/apache/spark/pull/54325#discussion_r2813844236
##########
sql/hive-thriftserver/src/main/scala/org/apache/spark/sql/hive/thriftserver/RowSetUtils.scala:
##########
@@ -142,7 +142,11 @@ object RowSetUtils {
val value = if (row.isNullAt(ordinal)) {
""
} else {
- toHiveString((row.get(ordinal), typ), nested = true,
timeFormatters, binaryFormatter)
+ val nested = typ match {
+ case _: GeometryType | _: GeographyType => false
Review Comment:
The `toTColumn` catch-all in `RowSetUtils` converts all non-primitive values
to strings for Thrift. Currently, it always passes `nested = true`, which tells
string-like types to wrap their output in double quotes. This was originally
designed so that strings inside containers (arrays, maps, structs) are quoted
to avoid ambiguity with other delimiters, for example `["hello","world"]`
rather than `[hello,world]`.

For geospatial types in particular, we added the same nested quoting in
`toHiveString`, e.g. for an array:
`["SRID=0;LINESTRING(0 0, 1 1)","SRID=0;POINT(1 2)"]`. However, Thrift's
`toTColumn` path always sets `nested = true`. This is not because the values
are inside a container, but because they are being serialized into a Thrift
string column. As a result, even a standalone `SELECT ST_GeomFromWKB(...)`
would produce `"POINT(1 2)"`, with spurious quotes around the single EWKT
value.

This additional pattern match therefore special-cases (singular) geospatial
types in the Thrift server path, overriding `nested` to `false` so that
standalone geospatial values display cleanly, e.g. `POINT(1 2)`. The
pre-existing `nested = true` quoting still applies in other Hive CLI paths,
where values genuinely appear inside arrays, maps, or structs.
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
To unsubscribe, e-mail: [email protected]
For queries about this service, please contact Infrastructure at:
[email protected]