pan3793 commented on code in PR #53018:
URL: https://github.com/apache/spark/pull/53018#discussion_r2521742982


##########
sql/connect/client/jdbc/src/main/scala/org/apache/spark/sql/connect/client/jdbc/util/JdbcTypeUtils.scala:
##########
@@ -101,6 +104,7 @@ private[jdbc] object JdbcTypeUtils {
     case StringType =>
       getPrecision(field)
     case DateType => 10 // length of `YYYY-MM-DD`
+    case BinaryType => Int.MaxValue

Review Comment:
   The Databricks JDBC driver returns 1 for BinaryType, but I don't think 
that makes sense.
   
   The Trino JDBC driver defines a constant VARBINARY_MAX = 1024 * 1024 * 1024.
   
   I think we'd better have a consistent default value for types that have 
no actual length limit, e.g., StringType and BinaryType.
   
   



##########
sql/connect/client/jdbc/src/main/scala/org/apache/spark/sql/connect/client/jdbc/util/JdbcTypeUtils.scala:
##########
@@ -77,6 +79,7 @@ private[jdbc] object JdbcTypeUtils {
     case StringType => 255
     case DecimalType.Fixed(p, _) => p
     case DateType => 10
+    case BinaryType => Int.MaxValue

Review Comment:
   The Databricks JDBC driver returns 1 for BinaryType, but I don't think 
that makes sense.
   
   The Trino JDBC driver defines a constant VARBINARY_MAX = 1024 * 1024 * 1024.
   
   I think we'd better have a consistent default value for types that have 
no actual length limit, e.g., StringType and BinaryType.
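   To illustrate the suggestion, here is a minimal, self-contained Scala 
sketch of what a shared default could look like. The constant name 
`UNBOUNDED_LENGTH` and the simplified `DataType` ADT are hypothetical and 
not from the PR; the value mirrors Trino's VARBINARY_MAX:

```scala
object JdbcTypeUtilsSketch {
  // Hypothetical shared constant for types with no practical length limit,
  // same value as Trino's VARBINARY_MAX (1 GiB).
  val UNBOUNDED_LENGTH: Int = 1024 * 1024 * 1024

  // Simplified stand-ins for Spark's Catalyst types, for illustration only.
  sealed trait DataType
  case object StringType extends DataType
  case object BinaryType extends DataType
  case object DateType extends DataType

  // Both unbounded types share the same default, instead of
  // Int.MaxValue for one and 255 for the other.
  def getPrecision(dt: DataType): Int = dt match {
    case StringType | BinaryType => UNBOUNDED_LENGTH
    case DateType => 10 // length of `YYYY-MM-DD`
  }
}
```

   With this shape, `getPrecision(StringType)` and `getPrecision(BinaryType)` 
always agree, and the magic number lives in one place.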



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]

