On Wed, Oct 18, 2017 at 10:35 AM, Eugene Kirpichov <kirpic...@google.com>
wrote:

> It seems that KafkaAvroDeserializer implements Deserializer<Object>,
> though I suppose with proper configuration that Object will at run-time be
> your desired type. Have you tried adding some Java type casts to make it
> compile?
>

+1, a cast is probably the simplest fix. Alternatively, you can wrap or
extend KafkaAvroDeserializer as Tim suggested, and cast the Object
returned by KafkaAvroDeserializer::deserialize to Envelope at runtime.
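To sketch what that wrapper could look like: the example below is a minimal, self-contained illustration, not tested against the real libraries. The Deserializer interface, KafkaAvroDeserializer, and Envelope classes here are stand-ins so the sketch compiles on its own, and EnvelopeDeserializer is a hypothetical name.

```java
// Stand-in for org.apache.kafka.common.serialization.Deserializer,
// reduced to the one method that matters for the type problem.
interface Deserializer<T> {
    T deserialize(String topic, byte[] data);
}

// Stand-in for Confluent's KafkaAvroDeserializer, which is declared as
// Deserializer<Object> -- hence the compile error in the thread.
class KafkaAvroDeserializer implements Deserializer<Object> {
    public Object deserialize(String topic, byte[] data) {
        // The real class returns the decoded Avro record; we fake that here.
        return new Envelope(new String(data));
    }
}

// Stand-in for the Avro-generated Envelope class from the thread.
class Envelope {
    final String payload;
    Envelope(String payload) { this.payload = payload; }
}

// The wrapper: delegates to KafkaAvroDeserializer and narrows the
// returned Object to the concrete Avro type with a runtime cast.
class EnvelopeDeserializer implements Deserializer<Envelope> {
    private final KafkaAvroDeserializer inner = new KafkaAvroDeserializer();

    @Override
    public Envelope deserialize(String topic, byte[] data) {
        return (Envelope) inner.deserialize(topic, data);
    }
}

public class Main {
    public static void main(String[] args) {
        Deserializer<Envelope> d = new EnvelopeDeserializer();
        Envelope e = d.deserialize("customers", "hello".getBytes());
        System.out.println(e.payload); // prints "hello"
    }
}
```

With the real dependencies on the classpath, the stand-ins would be replaced by imports of org.apache.kafka.common.serialization.Deserializer and io.confluent.kafka.serializers.KafkaAvroDeserializer (the real interface also has configure/close methods to delegate), and EnvelopeDeserializer.class should then satisfy withValueDeserializerAndCoder alongside AvroCoder.of(Envelope.class).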


> On Wed, Oct 18, 2017 at 7:26 AM Tim Robertson <timrobertson...@gmail.com>
> wrote:
>
>> I just tried quickly and see the same as you, Andrew.
>> Either we're missing something obvious, or extending KafkaAvroDeserializer
>> is necessary, right?
>>
>> On Wed, Oct 18, 2017 at 3:14 PM, Andrew Jones <
>> andrew+b...@andrew-jones.com> wrote:
>>
>>> Hi,
>>>
>>> I'm trying to read Avro data from a Kafka stream using KafkaIO. I think
>>> it should be as simple as:
>>>
>>> p.apply(KafkaIO.<String, Envelope>read()
>>>   .withValueDeserializerAndCoder(KafkaAvroDeserializer.class,
>>>   AvroCoder.of(Envelope.class))
>>>
>>> Where Envelope is the name of the Avro class. However, that does not
>>> compile and I get the following error:
>>>
>>> incompatible types:
>>> java.lang.Class<io.confluent.kafka.serializers.KafkaAvroDeserializer>
>>> cannot be converted to java.lang.Class<? extends
>>> org.apache.kafka.common.serialization.Deserializer<
>>> dbserver1.inventory.customers.Envelope>>
>>>
>>> I've tried a number of variations on this theme but haven't yet worked
>>> it out and am starting to run out of ideas...
>>>
>>> Has anyone successfully read Avro data from Kafka?
>>>
>>> The code I'm using can be found at
>>> https://github.com/andrewrjones/debezium-kafka-beam-example and a full
>>> environment can be created with Docker.
>>>
>>> Thanks,
>>> Andrew
>>>
>>
>>