cloud-fan commented on code in PR #38922:
URL: https://github.com/apache/spark/pull/38922#discussion_r1053473031
##########
connector/protobuf/src/main/scala/org/apache/spark/sql/protobuf/utils/ProtobufOptions.scala:
##########
@@ -38,6 +38,14 @@ private[sql] class ProtobufOptions(
val parseMode: ParseMode =
parameters.get("mode").map(ParseMode.fromString).getOrElse(FailFastMode)
+
+  // Setting `recursive.fields.max.depth` to 0 drops all recursive fields,
+  // 1 allows it to be recursed once, 2 allows it to be recursed twice, and so on.
+  // A value of `recursive.fields.max.depth` greater than 10 is not permitted. If it is not
+  // specified, the default value is -1; recursive fields are not permitted. If a protobuf
+  // record has more depth than the allowed value for recursive fields, it will be truncated
+  // and some fields may be discarded.
+  val recursiveFieldMaxDepth: Int =
+    parameters.getOrElse("recursive.fields.max.depth", "-1").toInt
Review Comment:
The option name may need a bit more discussion. Usually data source options
do not have long names and don't contain dots. See `JSONOptions`. How about
`maxRecursiveFieldDepth`?
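
For reference, the truncation semantics described in the code comment (0 drops all recursive fields, 1 allows one level of recursion, deeper occurrences are discarded) can be sketched as follows. This is a hypothetical illustration using a plain nested dict, not Spark's actual protobuf deserializer; the `truncate` helper and the `value`/`child` field names are made up for the example.

```python
def truncate(record, max_depth, depth=0):
    """Keep the recursive 'child' field only while depth < max_depth;
    deeper occurrences are dropped, mirroring the documented option.
    (Hypothetical sketch, not Spark's implementation.)"""
    out = {"value": record["value"]}
    child = record.get("child")
    if child is not None and depth < max_depth:
        out["child"] = truncate(child, max_depth, depth + 1)
    return out

chain = {"value": 1, "child": {"value": 2, "child": {"value": 3}}}
truncate(chain, 0)  # all recursive fields dropped: {'value': 1}
truncate(chain, 1)  # one recursion kept: {'value': 1, 'child': {'value': 2}}
```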
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
To unsubscribe, e-mail: [email protected]
For queries about this service, please contact Infrastructure at:
[email protected]