heyihong commented on code in PR #54016:
URL: https://github.com/apache/spark/pull/54016#discussion_r2736479637
##########
sql/connect/common/src/main/scala/org/apache/spark/sql/connect/client/SparkConnectClient.scala:
##########
@@ -1103,7 +1104,13 @@ object SparkConnectClient {
responseListener: ClientCall.Listener[RespT],
headers: Metadata): Unit = {
metadata.foreach { case (key, value) =>
-      headers.put(Metadata.Key.of(key, Metadata.ASCII_STRING_MARSHALLER), value)
+      if (key != null && value != null && key.endsWith(Metadata.BINARY_HEADER_SUFFIX)) {
Review Comment:
I took a closer look at the code, and if I understand correctly, a null key or null value in Configuration.metadata should be considered invalid input (at minimum an empty string should be used instead). I was wondering whether it would make sense to add an assertion during the construction of MetadataHeaderClientInterceptor instead, as this could simplify the null-checking logic a bit while providing stronger defensive guarantees.
```scala
private[client] class MetadataHeaderClientInterceptor(metadata: Map[String, String])
    extends ClientInterceptor {
  metadata.foreach { case (key, value) => assert(key != null && value != null) }
  ...
}
```
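With nulls ruled out at construction time, the header-population loop would only need to distinguish binary from ASCII keys. A rough sketch of what that could look like; the use of `Metadata.BINARY_BYTE_MARSHALLER` and the base64 decoding of the value are my assumptions about how binary headers would be carried, not necessarily what this PR does:
```scala
import java.util.Base64
import io.grpc.Metadata

// Sketch only: assumes binary header values arrive as base64-encoded strings.
metadata.foreach { case (key, value) =>
  if (key.endsWith(Metadata.BINARY_HEADER_SUFFIX)) {
    // Keys ending in "-bin" must use the binary marshaller with raw bytes.
    headers.put(
      Metadata.Key.of(key, Metadata.BINARY_BYTE_MARSHALLER),
      Base64.getDecoder.decode(value))
  } else {
    headers.put(Metadata.Key.of(key, Metadata.ASCII_STRING_MARSHALLER), value)
  }
}
```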