yaooqinn commented on code in PR #46062:
URL: https://github.com/apache/spark/pull/46062#discussion_r1566653317
##########
sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/jdbc/JdbcUtils.scala:
##########
@@ -467,12 +467,8 @@ object JdbcUtils extends Logging with SQLConfHelper {
     case StringType if metadata.contains("rowid") =>
       (rs: ResultSet, row: InternalRow, pos: Int) =>
-        val rawRowId = rs.getRowId(pos + 1)
-        if (rawRowId == null) {
-          row.update(pos, null)
-        } else {
-          row.update(pos, UTF8String.fromString(rawRowId.toString))
-        }
+        val id = nullSafeConvert[RowId](rs.getRowId(pos + 1), r =>
+          UTF8String.fromBytes(r.getBytes))
Review Comment:
The assumption might be practical, as JDBC clients typically use the platform's
default charset. However, I don't think we can rely on that assumption. I might
close this first to see whether we can apply the charset properly for the
client-server encoding conversion.
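
To illustrate the concern, here is a minimal sketch (not Spark code; the
`fakeRowId` stub and object name are hypothetical). `RowId.getBytes` returns the
driver's raw byte representation, which is valid UTF-8 only if the driver
happens to encode it that way, whereas `toString` yields a JVM String the
driver has already decoded:

```scala
import java.sql.RowId
import org.apache.spark.unsafe.types.UTF8String

object RowIdCharsetSketch {
  // Hypothetical RowId whose raw bytes use the platform default charset,
  // as some JDBC drivers may do.
  def fakeRowId(s: String): RowId = new RowId {
    override def getBytes(): Array[Byte] = s.getBytes // platform default charset
    override def toString: String = s
  }

  def main(args: Array[String]): Unit = {
    val id = fakeRowId("AAAR1sAAEAAAACxAAA")
    // PR version: treats the getBytes output as UTF-8 text.
    val viaBytes = UTF8String.fromBytes(id.getBytes)
    // Pre-PR version: decode via the driver's String, then re-encode as UTF-8.
    val viaString = UTF8String.fromString(id.toString)
    // Equal for ASCII-only row ids; they diverge when the default charset is
    // not UTF-8 and the id contains non-ASCII bytes (mojibake in viaBytes).
    println(viaBytes == viaString)
  }
}
```

For ASCII-only row ids the two paths agree, which is why the assumption can
look safe in testing; the divergence only shows up with non-ASCII bytes under a
non-UTF-8 default charset.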
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
To unsubscribe, e-mail: [email protected]
For queries about this service, please contact Infrastructure at:
[email protected]