the-other-tim-brown commented on code in PR #6111:
URL: https://github.com/apache/hudi/pull/6111#discussion_r955348242


##########
rfc/rfc-57/rfc-57.md:
##########
@@ -0,0 +1,85 @@
+<!--
+ Licensed to the Apache Software Foundation (ASF) under one or more
+ contributor license agreements.  See the NOTICE file distributed with
+ this work for additional information regarding copyright ownership.
+ The ASF licenses this file to You under the Apache License, Version 2.0
+ (the "License"); you may not use this file except in compliance with
+ the License.  You may obtain a copy of the License at
+
+      http://www.apache.org/licenses/LICENSE-2.0
+
+ Unless required by applicable law or agreed to in writing, software
+ distributed under the License is distributed on an "AS IS" BASIS,
+ WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ See the License for the specific language governing permissions and
+ limitations under the License.
+-->
+# RFC-57: DeltaStreamer Protobuf Support
+
+
+
+## Proposers
+
+- @the-other-tim-brown
+
+## Approvers
+- @bhasudha
+- @vinothchandar
+
+## Status
+
+JIRA: https://issues.apache.org/jira/browse/HUDI-4399
+
+> Please keep the status updated in `rfc/README.md`.
+
+## Abstract
+
+Support consuming Protobuf messages from Kafka with the DeltaStreamer.
+
+## Background
+Hudi's DeltaStreamer currently supports consuming Avro and JSON data from Kafka, but it does not support Protobuf. Adding support will require:
+1. Parsing the data from Kafka into Protobuf Messages
+2. Generating a schema from a Protobuf Message class
+3. Converting from Protobuf to Avro
+
+## Implementation
+
+### Parsing Data from Kafka
+Users will provide the class name of a Protobuf Message contained within a jar on the classpath. We will then implement a deserializer that parses the bytes from the Kafka message into a Protobuf Message.
+
+Configuration options:
+hoodie.deltastreamer.schemaprovider.proto.className - The Protobuf Message class to use
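
As a rough sketch of the parsing step (class name and error handling here are illustrative, not part of this RFC), a Kafka `Deserializer` could resolve the configured Message class reflectively and call its generated `parseFrom` method:

```java
import java.lang.reflect.Method;
import java.util.Map;

import com.google.protobuf.Message;
import org.apache.kafka.common.serialization.Deserializer;

// Illustrative sketch: load the configured Protobuf Message class and use it to parse raw Kafka bytes.
public class ProtoKafkaDeserializer implements Deserializer<Message> {
  private Method parseFrom;

  @Override
  public void configure(Map<String, ?> configs, boolean isKey) {
    try {
      // hoodie.deltastreamer.schemaprovider.proto.className names the generated Message class
      String className = configs.get("hoodie.deltastreamer.schemaprovider.proto.className").toString();
      // Every generated Protobuf class exposes a static parseFrom(byte[]) method
      parseFrom = Class.forName(className).getMethod("parseFrom", byte[].class);
    } catch (ReflectiveOperationException e) {
      throw new IllegalArgumentException("Unable to load Protobuf Message class", e);
    }
  }

  @Override
  public Message deserialize(String topic, byte[] data) {
    try {
      return (Message) parseFrom.invoke(null, (Object) data);
    } catch (ReflectiveOperationException e) {
      throw new RuntimeException("Failed to parse Protobuf message from topic " + topic, e);
    }
  }
}
```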
+
+### ProtobufClassBasedSchemaProvider
+This new SchemaProvider will allow the user to provide a Protobuf Message class and get an Avro Schema. In the proto world there is no concept of a nullable field, so people use wrapper types such as Int32Value and StringValue to represent nullable fields. The schema provider will also allow the user to treat these wrapper fields as nullable versions of the fields they wrap instead of treating them as nested messages. In practice, this means that the user can choose between representing a field `Int32Value my_int = 1;` as `my_int.value` or simply `my_int` when writing the data out to the file system.
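
For illustration only (this is not the RFC's implementation, just the two output shapes it describes), an `Int32Value my_int = 1;` field could map to either of these Avro schemas:

```java
import org.apache.avro.Schema;
import org.apache.avro.SchemaBuilder;

// Sketch of the two possible Avro shapes for a proto field `Int32Value my_int = 1;`
public class WrapperFieldShapes {
  public static void main(String[] args) {
    // flattenWrappers = false: the wrapper stays a nested record, so readers reference my_int.value
    Schema nested = SchemaBuilder.record("Example").fields()
        .name("my_int")
        .type(SchemaBuilder.record("Int32Value").fields().optionalInt("value").endRecord())
        .noDefault()
        .endRecord();

    // flattenWrappers = true: the wrapper collapses to a nullable int referenced simply as my_int
    Schema flattened = SchemaBuilder.record("Example").fields()
        .optionalInt("my_int")
        .endRecord();

    System.out.println(nested.toString(true));
    System.out.println(flattened.toString(true));
  }
}
```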
+
+#### Handling of Unsigned Integers and Longs
+Protobuf provides support for unsigned integers and longs while Avro does not. The schema provider will convert unsigned integers and longs to the Avro long type in the schema definition.
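
As a minimal sketch of what that widening means in practice (the helper below is hypothetical, not from this RFC), a proto `uint32` held in a Java `int` can be widened to a non-negative `long`:

```java
// Hypothetical helper showing the unsigned-to-long widening described above.
public class UnsignedToAvroLong {
  // A proto uint32 arrives in Java as a signed int; widen it so the Avro long carries the unsigned value.
  static long uint32ToLong(int uint32Value) {
    return Integer.toUnsignedLong(uint32Value);
  }

  public static void main(String[] args) {
    // 0xFFFFFFFF is -1 as a signed int but 4294967295 as a uint32
    System.out.println(uint32ToLong(0xFFFFFFFF)); // prints 4294967295
  }
}
```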
+
+#### Schema Evolution
+**Adding a Field:**
+Protobuf has a default value for all fields, and the translation from proto to Avro schema will carry over this default value so there are no errors when adding a new field to the proto definition.
+**Removing a Field:**
+If a user removes a field from the Protobuf schema, the schema provider will not be able to add this field to the Avro schema it generates. To avoid issues when writing data, users must set `hoodie.datasource.write.reconcile.schema=true` to properly reconcile the schemas if a field is removed from the proto definition. Users can avoid this situation by using the `deprecated` field option in proto instead of removing the field from the schema.
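
As a small illustration of the mitigation above (the property is shown exactly as named in this RFC; where it is set depends on your setup), the reconcile option would be added to the writer properties:

```properties
# Required when a field has been removed from the proto definition so the schemas reconcile on write
hoodie.datasource.write.reconcile.schema=true
```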
+
+Configuration Options:
+hoodie.deltastreamer.schemaprovider.proto.className - The Protobuf Message class to use
+hoodie.deltastreamer.schemaprovider.proto.flattenWrappers (Default: false) - By default the wrapper classes will be treated like any other message and have a nested `value` field. When this is set to true, there is no nested `value` field and the field is treated as nullable in the generated schema.
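
A hedged example of how these options might appear together in a DeltaStreamer properties file (`com.example.MyProtoMessage` is a placeholder class name, not from this RFC):

```properties
# Generated Protobuf Message class available on the classpath (placeholder value)
hoodie.deltastreamer.schemaprovider.proto.className=com.example.MyProtoMessage
# Collapse wrapper types such as Int32Value into nullable fields instead of nested records
hoodie.deltastreamer.schemaprovider.proto.flattenWrappers=true
```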
+
+### ProtoToAvroConverter

Review Comment:
   This is done to reduce the scope of the initial changes. The converter utils for generating a schema and converting into an Avro record with that schema got a bit large, so I was trying to reuse the Avro-to-Row logic for now. I can follow up with a direct converter. I think going from proto to Avro will have similar performance to the JSON-to-Avro transformation that exists today. Proto to Avro to Row is definitely something to improve on, though.


