rangadi commented on code in PR #40983:
URL: https://github.com/apache/spark/pull/40983#discussion_r1179717089
##########
connector/protobuf/src/main/scala/org/apache/spark/sql/protobuf/utils/ProtobufOptions.scala:
##########
@@ -39,13 +39,67 @@ private[sql] class ProtobufOptions(
   val parseMode: ParseMode =
     parameters.get("mode").map(ParseMode.fromString).getOrElse(FailFastMode)

-  // Setting the `recursive.fields.max.depth` to 1 allows it to be recurse once,
-  // and 2 allows it to be recursed twice and so on. A value of `recursive.fields.max.depth`
-  // greater than 10 is not permitted. If it is not specified, the default value is -1;
-  // A value of 0 or below disallows any recursive fields. If a protobuf
-  // record has more depth than the allowed value for recursive fields, it will be truncated
-  // and corresponding fields are ignored (dropped).
+  /**
+   * Adds support for recursive fields. If this option is is not specified, recursive fields are
Review Comment:
   Though not directly related to this PR, I expanded the documentation for
`recursive.fields.max.depth` with a clarifying example.
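   To make the depth semantics in the quoted doc concrete, here is a minimal, self-contained sketch (plain Scala, not Spark or the connector's actual implementation) of how a recursive message such as `message Person { string name = 1; Person friend = 2; }` is expanded: the recursive field is kept only while the depth budget lasts, and deeper occurrences are dropped. The `Person` case class and `expand` helper are illustrative names, not part of the connector.

```scala
// Hypothetical model of a recursive protobuf message:
//   message Person { string name = 1; Person friend = 2; }
case class Person(name: String, friend: Option[Person])

// Expand a Person into a nested Map, keeping the recursive `friend` field
// only while the remaining depth budget is positive. Beyond that the field
// is omitted, mirroring "truncated and corresponding fields are dropped".
def expand(p: Person, maxDepth: Int): Map[String, Any] = {
  val base: Map[String, Any] = Map("name" -> p.name)
  if (maxDepth > 0) base ++ p.friend.map(f => "friend" -> expand(f, maxDepth - 1))
  else base
}

val chain = Person("a", Some(Person("b", Some(Person("c", None)))))

// Depth 1 recurses once: "a" keeps "b", but "b" loses "c".
println(expand(chain, 1))
// Depth 2 recurses twice, so "c" survives as well.
println(expand(chain, 2))
```

   With depth 0 (or below, per the removed comment) the recursive field is dropped entirely, so `expand(chain, 0)` yields only `Map("name" -> "a")`.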
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
To unsubscribe, e-mail: [email protected]
For queries about this service, please contact Infrastructure at:
[email protected]