[
https://issues.apache.org/jira/browse/BEAM-1573?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Ismaël Mejía updated BEAM-1573:
-------------------------------
Component/s: (was: z-do-not-use-sdk-java-extensions)
io-java-kafka
> KafkaIO does not allow using Kafka serializers and deserializers
> ----------------------------------------------------------------
>
> Key: BEAM-1573
> URL: https://issues.apache.org/jira/browse/BEAM-1573
> Project: Beam
> Issue Type: Improvement
> Components: io-java-kafka
> Affects Versions: 0.4.0, 0.5.0
> Reporter: peay
> Assignee: Raghu Angadi
> Priority: Minor
> Fix For: 2.0.0
>
>
> KafkaIO does not allow overriding the serializer and deserializer settings of
> the Kafka consumers and producers it uses internally. Instead, it only accepts
> a `Coder`, and wraps it in a simple Kafka serializer/deserializer class that
> delegates to the coder.
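> To spell out what that internal wrapper amounts to, here is a rough sketch (the
> class name is hypothetical and the signatures follow the current `Coder` and
> Kafka `Deserializer` interfaces; constructor injection is used for brevity,
> whereas Kafka normally instantiates deserializers reflectively and passes
> settings through `configure()`):
> {code:java}
> import java.io.ByteArrayInputStream;
> import java.io.IOException;
> import java.util.Map;
>
> import org.apache.beam.sdk.coders.Coder;
> import org.apache.kafka.common.serialization.Deserializer;
>
> /** Roughly the shape of the wrapper: a Kafka Deserializer that just calls a Coder. */
> public class CoderDeserializer<T> implements Deserializer<T> {
>
>   private final Coder<T> coder;
>
>   public CoderDeserializer(Coder<T> coder) {
>     this.coder = coder;
>   }
>
>   @Override
>   public void configure(Map<String, ?> configs, boolean isKey) {}
>
>   @Override
>   public T deserialize(String topic, byte[] data) {
>     // The topic is ignored: the Coder abstraction has no way to use it.
>     try {
>       return coder.decode(new ByteArrayInputStream(data));
>     } catch (IOException e) {
>       throw new RuntimeException("Could not decode record", e);
>     }
>   }
>
>   @Override
>   public void close() {}
> }
> {code}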
> I appreciate that supporting Beam coders is good and consistent with the rest
> of the system. However, is there a reason to completely disallow using custom
> Kafka serializers instead?
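> For illustration, this is the kind of API that would enable; the method names
> here are hypothetical, not an existing KafkaIO API, but `StringDeserializer`
> and Confluent's `KafkaAvroDeserializer` are real Kafka classes:
> {code:java}
> // Hypothetical sketch of the requested API: pass Kafka Deserializer classes
> // directly instead of Beam coders, so schema-registry deserializers just work.
> PCollection<KafkaRecord<String, Object>> records = pipeline.apply(
>     KafkaIO.<String, Object>read()
>         .withBootstrapServers("broker-1:9092")
>         .withTopic("events")
>         .withKeyDeserializer(StringDeserializer.class)          // plain Kafka deserializer
>         .withValueDeserializer(KafkaAvroDeserializer.class));   // schema-registry aware
> {code}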
> Disallowing them is a limitation when working with an Avro schema registry,
> for instance, which requires custom serializers. One can write a `Coder` that
> wraps a custom Kafka serializer, but that means two levels of unnecessary
> wrapping. In addition, the `Coder` abstraction is not equivalent to Kafka's
> `Serializer`, which receives the topic name as input. Using a `Coder` wrapper
> would require duplicating the output topic setting, once in the argument to
> `KafkaIO` and once when building the wrapper, which is inelegant and error-prone.
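> To make that mismatch concrete, here is a rough sketch of such a wrapper.
> `KafkaAvroSerializer` and `KafkaAvroDeserializer` are Confluent's
> schema-registry classes; `AvroRegistryCoder` is hypothetical, and the
> `encode`/`decode` signatures follow the newer `CustomCoder` API. Note that the
> topic has to be configured twice, once on `KafkaIO` and once here, because a
> `Coder` never sees it:
> {code:java}
> import java.io.ByteArrayOutputStream;
> import java.io.IOException;
> import java.io.InputStream;
> import java.io.OutputStream;
> import java.util.Map;
>
> import org.apache.avro.generic.GenericRecord;
> import org.apache.beam.sdk.coders.CustomCoder;
>
> import io.confluent.kafka.serializers.KafkaAvroDeserializer;
> import io.confluent.kafka.serializers.KafkaAvroSerializer;
>
> /** Hypothetical coder wrapping a schema-registry (de)serializer. */
> public class AvroRegistryCoder extends CustomCoder<GenericRecord> {
>
>   private final String topic;               // duplicated from the KafkaIO transform
>   private final Map<String, Object> config; // schema.registry.url, etc.
>
>   private transient KafkaAvroSerializer serializer;
>   private transient KafkaAvroDeserializer deserializer;
>
>   public AvroRegistryCoder(String topic, Map<String, Object> config) {
>     this.topic = topic;
>     this.config = config;
>   }
>
>   private void ensureClients() {
>     if (serializer == null) {
>       serializer = new KafkaAvroSerializer();
>       serializer.configure(config, false);
>       deserializer = new KafkaAvroDeserializer();
>       deserializer.configure(config, false);
>     }
>   }
>
>   @Override
>   public void encode(GenericRecord value, OutputStream out) throws IOException {
>     ensureClients();
>     // The topic must be passed by hand; Kafka's Serializer would receive it for free.
>     out.write(serializer.serialize(topic, value));
>   }
>
>   @Override
>   public GenericRecord decode(InputStream in) throws IOException {
>     ensureClients();
>     // Reads the remaining bytes; a real coder would length-prefix for nested contexts.
>     ByteArrayOutputStream buffer = new ByteArrayOutputStream();
>     byte[] chunk = new byte[4096];
>     int n;
>     while ((n = in.read(chunk)) != -1) {
>       buffer.write(chunk, 0, n);
>     }
>     return (GenericRecord) deserializer.deserialize(topic, buffer.toByteArray());
>   }
> }
> {code}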
--
This message was sent by Atlassian JIRA
(v7.6.3#76005)