MaxGekk commented on code in PR #44468:
URL: https://github.com/apache/spark/pull/44468#discussion_r1437681811
##########
connector/connect/server/src/main/scala/org/apache/spark/sql/connect/utils/ErrorUtils.scala:
##########
@@ -181,7 +182,12 @@ private[connect] object ErrorUtils extends Logging {
     }
     val errorClass = e.getErrorClass
     if (errorClass != null && errorClass.nonEmpty) {
-      errorInfo.putMetadata("errorClass", errorClass)
+      val messageParameters = JsonMethods.compact(
+        JsonMethods.render(map2jvalue(e.getMessageParameters.asScala.toMap)))
+      if (messageParameters.length <= maxMetadataSize) {
+        errorInfo.putMetadata("errorClass", errorClass)
+        errorInfo.putMetadata("messageParameters", messageParameters)
Review Comment:
In the PR https://github.com/apache/spark/pull/44464, I require that all Spark
exceptions have an error class (they cannot be built from just a text message).
I found a few test failures on the Connect client while creating a
`SparkThrowable` in `GrpcExceptionConverter`, for example:
```scala
errorConstructor(params =>
new SparkNumberFormatException(
params.errorClass.orNull,
params.messageParameters,
params.queryContext)),
```
`SparkNumberFormatException`'s constructor fails because it cannot fill in the
message parameters: we don't transfer them, so the map arrives empty.
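
For illustration, here is a minimal sketch of the round trip this metadata enables, assuming json4s is on the classpath. The `serialize` side uses the same expression as the diff above; `deserialize`, the object name, and the sample parameters are hypothetical stand-ins for what the client-side `GrpcExceptionConverter` would need to do:
```scala
import org.json4s._
import org.json4s.JsonDSL._
import org.json4s.jackson.JsonMethods

object MessageParametersRoundTrip {

  // Server side (same expression as the diff above): pack the message
  // parameters map into one JSON string so it fits in a single ErrorInfo
  // metadata value.
  def serialize(params: Map[String, String]): String =
    JsonMethods.compact(JsonMethods.render(map2jvalue(params)))

  // Client side (hypothetical counterpart in GrpcExceptionConverter): parse
  // the metadata value back into a map so exceptions such as
  // SparkNumberFormatException can be rebuilt with their message parameters
  // instead of an empty map.
  def deserialize(json: String): Map[String, String] =
    JsonMethods.parse(json) match {
      case JObject(fields) => fields.collect { case (k, JString(v)) => k -> v }.toMap
      case _ => Map.empty
    }

  def main(args: Array[String]): Unit = {
    val params = Map("expression" -> "'1.0'", "sourceType" -> "\"DOUBLE\"")
    assert(deserialize(serialize(params)) == params)
  }
}
```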