codope commented on code in PR #6111:
URL: https://github.com/apache/hudi/pull/6111#discussion_r951401598
##########
rfc/rfc-57/rfc-57.md:
##########
@@ -0,0 +1,76 @@
+<!--
+  Licensed to the Apache Software Foundation (ASF) under one or more
+  contributor license agreements. See the NOTICE file distributed with
+  this work for additional information regarding copyright ownership.
+  The ASF licenses this file to You under the Apache License, Version 2.0
+  (the "License"); you may not use this file except in compliance with
+  the License. You may obtain a copy of the License at
+
+      http://www.apache.org/licenses/LICENSE-2.0
+
+  Unless required by applicable law or agreed to in writing, software
+  distributed under the License is distributed on an "AS IS" BASIS,
+  WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+  See the License for the specific language governing permissions and
+  limitations under the License.
+-->
+# RFC-56: DeltaStreamer Protobuf Support
+
+## Proposers
+
+- @the-other-tim-brown
+
+## Approvers
+- @bhasudha
+- @vinothchandar
+
+## Status
+
+JIRA: https://issues.apache.org/jira/browse/HUDI-4399
+
+> Please keep the status updated in `rfc/README.md`.
+
+## Abstract
+
+Support consuming Protobuf messages from Kafka with the DeltaStreamer.
+
+## Background
+Hudi's DeltaStreamer currently supports consuming Avro and JSON data from Kafka but it does not support Protobuf. Adding support will require:
+1. Parsing the data from Kafka into Protobuf Messages
+2. Generating a schema from a Protobuf Message class
+3. Converting from Protobuf to Avro
+
+## Implementation
+
+### Parsing Data from Kafka
+Users will provide a classname for the Protobuf Message that is contained within a jar that is on the path. We will then implement a deserializer that parses the bytes from the Kafka message into a protobuf Message.
+
+Configuration options:
+hoodie.deltastreamer.schemaprovider.proto.className - The class to use
+
+### ProtobufClassBasedSchemaProvider
+This new SchemaProvider will allow the user to provide a Protobuf Message class and get an Avro Schema. In the proto world, there is no concept of a nullable field, so people use wrapper types such as Int32Value and StringValue to represent a nullable field. The schema provider will also allow the user to treat these wrapper fields as nullable versions of the fields they are wrapping, instead of treating them as nested messages. In practice, this means that the user can choose between representing a field `Int32Value my_int = 1;` as `my_int.value` or simply `my_int` when writing the data out to the file system.

Review Comment:
   Let's elaborate more on certain schema aspects. You have already covered nullable fields. Other things to consider:
   1. Unsigned types: Avro doesn't support unsigned types, so they should probably map to Avro's long type.
   2. Schema evolution: Protobuf and Avro handle schema evolution differently. I think adding and removing fields should be OK as long as we consider default values in Avro while converting.
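To make the unsigned-type point above concrete: a proto `uint32` surfaces in generated Java code as a signed `int`, so preserving values above `Integer.MAX_VALUE` requires widening to Avro's `long`. A minimal sketch of the idea (the class and method names here are hypothetical illustrations, not part of the PR):

```java
// Hypothetical sketch: proto uint32 fields arrive as signed Java ints,
// so values above Integer.MAX_VALUE wrap negative. Widening with
// Integer.toUnsignedLong recovers the intended unsigned value, which
// fits in an Avro long.
public class UnsignedMappingSketch {

    // Map a proto uint32 (held in a signed int) to an Avro long value.
    static long uint32ToAvroLong(int protoValue) {
        return Integer.toUnsignedLong(protoValue);
    }

    public static void main(String[] args) {
        int raw = (int) 4294967295L; // uint32 max; stored as -1 in Java
        System.out.println(uint32ToAvroLong(raw)); // prints 4294967295
    }
}
```

The same widening applies to `fixed32`; `uint64` and `fixed64` are harder, since Avro has no type wider than `long`.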
##########
rfc/rfc-57/rfc-57.md:
##########
+Configuration Options:
+hoodie.deltastreamer.schemaprovider.proto.className - The class to use
+hoodie.deltastreamer.schemaprovider.proto.flattenWrappers (Default: false) - By default the wrapper classes will be treated like any other message and have a nested `value` field.
When this is set to true, we do not have a nested `value` field and instead treat the field as nullable in the generated schema.
+
+### ProtoToAvroConverter
+
+A class will be created that can take in a Protobuf Message and convert it to an Avro GenericRecord. This will be used inside the SourceFormatAdapter to properly convert to an Avro RDD. To convert to `Dataset<Row>`, we will first convert to Avro and then to Row. This change will also add a new `Source.SourceType` so that other sources can implement this source type in the future, for example Protobuf messages on PubSub.

Review Comment:
   > A class will be created

   What do you mean by this? When will this class be created? Do users need to implement an interface provided by us?

--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]
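For reference, the wrapper-flattening choice discussed in the thread could yield an Avro schema fragment like the following when `flattenWrappers` is true (the record and field names here are invented for illustration); with the default of false, `my_int` would instead be a nullable nested record containing a single `value` field:

```
{
  "type": "record",
  "name": "SampleEvent",
  "fields": [
    {"name": "my_int", "type": ["null", "int"], "default": null}
  ]
}
```

The `["null", "int"]` union with a `null` default is the standard Avro idiom for an optional field, which also addresses the reviewer's point about defaults during schema evolution.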
