Hi Murtaza,

I don't think it's the same issue. I'm using the exact same schema on both ends: I generated an IndexedRecord class from an avsc file, and both the encoder and the decoder reference that same generated type (and therefore the same SCHEMA$).
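In case it helps judge the issue, here's roughly what my encoder/decoder pair looks like. This is a simplified sketch from memory, with MyRecord standing in for the class generated from the avsc, so the names are illustrative:

    import java.io.ByteArrayInputStream;
    import java.io.ByteArrayOutputStream;
    import java.io.IOException;
    import java.nio.ByteBuffer;
    import kafka.message.Message;
    import kafka.serializer.Decoder;
    import kafka.serializer.Encoder;
    import org.apache.avro.file.DataFileStream;
    import org.apache.avro.file.DataFileWriter;
    import org.apache.avro.specific.SpecificDatumReader;
    import org.apache.avro.specific.SpecificDatumWriter;

    // MyRecord is a placeholder for the IndexedRecord generated from the avsc.
    public class MyRecordEncoder implements Encoder<MyRecord> {
        public Message toMessage(MyRecord record) {
            try {
                ByteArrayOutputStream out = new ByteArrayOutputStream();
                DataFileWriter<MyRecord> writer =
                    new DataFileWriter<MyRecord>(new SpecificDatumWriter<MyRecord>(MyRecord.class));
                writer.create(MyRecord.SCHEMA$, out); // same generated SCHEMA$ on both ends
                writer.append(record);
                writer.close();
                return new Message(out.toByteArray());
            } catch (IOException e) {
                throw new RuntimeException(e);
            }
        }
    }

    public class MyRecordDecoder implements Decoder<MyRecord> {
        public MyRecord toEvent(Message message) {
            try {
                // Copy the raw payload bytes out of the Kafka message
                ByteBuffer buf = message.payload();
                byte[] bytes = new byte[buf.remaining()];
                buf.get(bytes);
                DataFileStream<MyRecord> stream = new DataFileStream<MyRecord>(
                    new ByteArrayInputStream(bytes),
                    new SpecificDatumReader<MyRecord>(MyRecord.class));
                // next() is where the NoSuchElementException mentioned below surfaces
                MyRecord record = stream.next();
                stream.close();
                return record;
            } catch (IOException e) {
                throw new RuntimeException(e);
            }
        }
    }

As far as I can tell, the same container bytes that go into the Message on the producer side should come back out of the payload on the consumer side.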
Also, the payload size mismatch is on the raw ByteBuffer before any decoding occurs, is it not? (No Avro/schema is involved on the decoding side at that point.)

Regards,

Emanuel

On 2012-09-24, at 3:29 PM, Murtaza Doctor <murt...@richrelevance.com> wrote:

> Hi Frank,
>
> I had a similar issue, which was due to an Avro schema mismatch between the
> producer and consumer ends. We are now working on an Avro schema repository
> service to handle this issue. If you can share your code snippet and stack
> trace, I will be able to better judge the issue.
>
> Regards,
> murtaza
>
> On 9/24/12 10:57 AM, "Frank Grimes" <frankgrime...@yahoo.com> wrote:
>
>> Hi All,
>>
>> I'm new to Kafka and am having trouble sending/receiving messages in Avro
>> format.
>> I have Kafka 0.7.1-incubating talking to a standalone ZooKeeper 3.3.6
>> installation.
>> The String producer/consumer examples in the quick start guide are
>> working fine, so I believe my setup and config are correct.
>>
>> After getting that working, I decided to write an encoder/decoder for my
>> custom Avro record structure.
>> They use Avro's DataFileWriter/DataFileStream to encode/decode the
>> Message payload.
>> I've tested my encoder and decoder standalone and they are working fine.
>>
>> The problem I'm having is that when these messages are consumed through a
>> Kafka client they no longer deserialize correctly.
>> Specifically, I get a NoSuchElementException thrown in
>> DataFileStream.next().
>> Also, I noticed that the Message payload size in the producer does not
>> match the payload size in the consumer.
>>
>> Producer payload size: 2404
>> Consumer payload size: 2326
>>
>> I tried disabling Avro compression but the problem remains.
>>
>> Any help would be greatly appreciated...
>>
>> Thanks,
>>
>> Frank Grimes
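P.S. For reference, the two payload sizes are read straight off the raw bytes, roughly like this (again a simplified sketch with made-up variable names, going from memory on the 0.7 API):

    // Producer side, inside the encoder, just before the bytes are wrapped
    // in a kafka.message.Message ('out' is the ByteArrayOutputStream the
    // DataFileWriter wrote into):
    byte[] bytes = out.toByteArray();
    System.out.println("Producer payload size: " + bytes.length);

    // Consumer side, before any Avro decoding touches the message; only the
    // raw ByteBuffer handed back by Kafka is involved at this point:
    java.nio.ByteBuffer payload = message.payload();
    System.out.println("Consumer payload size: " + payload.remaining());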