It seems that KafkaAvroDeserializer implements Deserializer<Object>, though
I suppose that, with the proper configuration, the Object it returns will at
runtime be your desired type. Have you tried adding some Java type casts to
make it compile?
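Since the thread never shows a working snippet, here is a minimal, self-contained sketch of both ideas (the cast and the wrapper class). Note that `Deserializer`, `KafkaAvroDeserializer`, and `Envelope` below are trimmed-down stand-ins for the real Kafka/Confluent/Avro types, not the actual classes, so the shapes are illustrative only:

```java
import java.nio.charset.StandardCharsets;

public class DeserializerCastSketch {

    // Stand-in for org.apache.kafka.common.serialization.Deserializer<T>
    interface Deserializer<T> {
        T deserialize(String topic, byte[] data);
    }

    // Stand-in for the Avro-generated value class
    static class Envelope {
        final String payload;
        Envelope(String payload) { this.payload = payload; }
    }

    // Stand-in for io.confluent.kafka.serializers.KafkaAvroDeserializer,
    // which implements Deserializer<Object>, not Deserializer<Envelope>
    static class KafkaAvroDeserializer implements Deserializer<Object> {
        public Object deserialize(String topic, byte[] data) {
            return new Envelope(new String(data, StandardCharsets.UTF_8));
        }
    }

    // Workaround 1: an unchecked double cast of the Class token. This
    // compiles (with a warning) and is safe only if the deserializer
    // really produces Envelope instances at runtime.
    @SuppressWarnings("unchecked")
    static Class<? extends Deserializer<Envelope>> castClass() {
        return (Class<? extends Deserializer<Envelope>>)
                (Class<?>) KafkaAvroDeserializer.class;
    }

    // Workaround 2: a small typed wrapper. Note it delegates rather than
    // extends: a subclass of KafkaAvroDeserializer cannot also implement
    // Deserializer<Envelope>, because Java forbids inheriting the same
    // generic interface with two different type arguments.
    static class EnvelopeDeserializer implements Deserializer<Envelope> {
        private final KafkaAvroDeserializer inner = new KafkaAvroDeserializer();

        public Envelope deserialize(String topic, byte[] data) {
            return (Envelope) inner.deserialize(topic, data);
        }
    }

    public static void main(String[] args) throws Exception {
        Deserializer<Envelope> viaCast =
                castClass().getDeclaredConstructor().newInstance();
        System.out.println(viaCast.deserialize("t",
                "hello".getBytes(StandardCharsets.UTF_8)).payload); // hello

        Deserializer<Envelope> viaWrapper = new EnvelopeDeserializer();
        System.out.println(viaWrapper.deserialize("t",
                "world".getBytes(StandardCharsets.UTF_8)).payload); // world
    }
}
```

Against the real API, workaround 1 would mean passing the cast `Class` token to `withValueDeserializerAndCoder`, and workaround 2 would mean registering the wrapper class instead of `KafkaAvroDeserializer.class`.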

On Wed, Oct 18, 2017 at 7:26 AM Tim Robertson <[email protected]>
wrote:

> I just tried quickly and see the same as you, Andrew.
> Either we're missing something obvious, or extending KafkaAvroDeserializer
> is necessary, right?
>
> On Wed, Oct 18, 2017 at 3:14 PM, Andrew Jones <
> [email protected]> wrote:
>
>> Hi,
>>
>> I'm trying to read Avro data from a Kafka stream using KafkaIO. I think
>> it should be as simple as:
>>
>> p.apply(KafkaIO.<String, Envelope>read()
>>   .withValueDeserializerAndCoder(KafkaAvroDeserializer.class,
>>   AvroCoder.of(Envelope.class))
>>
>> Where Envelope is the name of the Avro class. However, that does not
>> compile and I get the following error:
>>
>> incompatible types:
>> java.lang.Class<io.confluent.kafka.serializers.KafkaAvroDeserializer>
>> cannot be converted to java.lang.Class<? extends
>>
>> org.apache.kafka.common.serialization.Deserializer<dbserver1.inventory.customers.Envelope>>
>>
>> I've tried a number of variations on this theme but haven't yet worked
>> it out and am starting to run out of ideas...
>>
>> Has anyone successfully read Avro data from Kafka?
>>
>> The code I'm using can be found at
>> https://github.com/andrewrjones/debezium-kafka-beam-example and a full
>> environment can be created with Docker.
>>
>> Thanks,
>> Andrew
>>
>
>
