rangadi commented on code in PR #40686:
URL: https://github.com/apache/spark/pull/40686#discussion_r1181711867


##########
connector/protobuf/src/main/scala/org/apache/spark/sql/protobuf/utils/ProtobufOptions.scala:
##########
@@ -46,6 +46,41 @@ private[sql] class ProtobufOptions(
   // record has more depth than the allowed value for recursive fields, it will be truncated
   // and corresponding fields are ignored (dropped).
   val recursiveFieldMaxDepth: Int = parameters.getOrElse("recursive.fields.max.depth", "-1").toInt
+
+  // Whether to deserialize empty proto3 scalar fields as their default values.
+  // When a proto3 scalar field is empty, there is ambiguity as to whether the field
+  // was never set or if the field was explicitly set to zero. By default, the library

Review Comment:
   > Other official libraries that take protobufs and put them into a different format (e.g. Json) have the option to emit defaults. This PR is implementing that exact same functionality, with same semantics.
   
   
   Let's add this to the documentation.
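
   To illustrate the ambiguity the option addresses, here is a toy Python sketch (not real protobuf or Spark code; the names `encode`, `decode`, and `emit_defaults` are invented for illustration). proto3 omits scalar fields equal to their type default on the wire, so a decoder cannot tell "never set" apart from "explicitly set to zero" unless it emits defaults:

   ```python
   # Toy model of proto3 scalar-field semantics (NOT the protobuf wire format).
   # proto3 drops scalar fields whose value equals the type default (0, "", False),
   # so "unset" and "explicitly default" are indistinguishable after serialization.

   DEFAULTS = {"id": 0, "name": "", "active": False}

   def encode(msg: dict) -> dict:
       """Drop fields equal to their default, mimicking proto3 serialization."""
       return {k: v for k, v in msg.items() if v != DEFAULTS[k]}

   def decode(wire: dict, emit_defaults: bool = False) -> dict:
       """Without emit_defaults, missing fields decode to None (null in a Row);
       with emit_defaults, they are filled with the proto3 default value."""
       if emit_defaults:
           return {k: wire.get(k, DEFAULTS[k]) for k in DEFAULTS}
       return {k: wire.get(k) for k in DEFAULTS}

   wire = encode({"id": 0, "name": "alice", "active": True})
   print(wire)                              # {'name': 'alice', 'active': True}
   print(decode(wire))                      # {'id': None, 'name': 'alice', 'active': True}
   print(decode(wire, emit_defaults=True))  # {'id': 0, 'name': 'alice', 'active': True}
   ```

   The second decode mirrors what official protobuf-to-JSON converters do when asked to emit defaults, which is the behavior this PR brings to the Spark connector.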



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]

