rangadi commented on code in PR #40983:
URL: https://github.com/apache/spark/pull/40983#discussion_r1180584516


##########
connector/protobuf/src/main/scala/org/apache/spark/sql/protobuf/ProtobufDeserializer.scala:
##########
@@ -70,6 +69,15 @@ private[sql] class ProtobufDeserializer(
 
   def deserialize(data: Message): Option[InternalRow] = converter(data)
 
+  // JsonFormatter used to convert Any fields to JSON (if the option is enabled in
+  // Protobuf options). It keeps the original field names and does not include any
+  // extra whitespace in the JSON.
+  // If the runtime type for an Any field is not found in the registry, it throws
+  // an exception.
+  private val jsonPrinter = JsonFormat.printer
+    .omittingInsignificantWhitespace()
+    .preservingProtoFieldNames()

Review Comment:
   Yes, we do use the field names as they appear in the Protobuf definition.
   The setting here only applies when Any fields are converted to JSON during deserialization. Without it, the "event_name" field in the example would appear as "eventName" in the JSON.


