FWIW, here is the stack trace on the consumer side:

java.util.NoSuchElementException
        at org.apache.avro.file.DataFileStream.next(DataFileStream.java:232)
        at org.apache.avro.file.DataFileStream.next(DataFileStream.java:220)
        at com..kafkatest.util.AvroLogEventDecoder.toEvent(AvroEventDecoder.java:58)
        at com..kafkatest.util.AvroEventDecoder.toEvent(AvroEventDecoder.java:19)
        at kafka.consumer.ConsumerIterator.makeNext(ConsumerIterator.scala:88)
        at kafka.consumer.ConsumerIterator.makeNext(ConsumerIterator.scala:32)
        at kafka.utils.IteratorTemplate.maybeComputeNext(IteratorTemplate.scala:59)
        at kafka.utils.IteratorTemplate.hasNext(IteratorTemplate.scala:51)
        at com.kafkatest.AvroConsumerClient$1.run(AvroConsumerClient.java:50)
        at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:441)
        at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:303)
        at java.util.concurrent.FutureTask.run(FutureTask.java:138)
        at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:886)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:908)
        at java.lang.Thread.run(Thread.java:680)
        at java.lang.Thread.run(Thread.java:680)
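
For reference, the decode path that hits this exception looks roughly like the
sketch below (simplified, with placeholder class and variable names rather than
my actual AvroEventDecoder). DataFileStream.next() throws NoSuchElementException
when hasNext() is false, i.e. when the bytes don't form a complete Avro data
file container:

import java.io.ByteArrayInputStream;
import java.nio.ByteBuffer;

import org.apache.avro.file.DataFileStream;
import org.apache.avro.generic.GenericDatumReader;
import org.apache.avro.generic.GenericRecord;

// Simplified sketch of the decode path; names are placeholders, not the
// actual AvroEventDecoder.
public class SketchDecoder {
    public GenericRecord toEvent(ByteBuffer payload) throws Exception {
        byte[] bytes = new byte[payload.remaining()];
        payload.get(bytes);
        DataFileStream<GenericRecord> stream = new DataFileStream<GenericRecord>(
                new ByteArrayInputStream(bytes),
                new GenericDatumReader<GenericRecord>());
        // next() throws NoSuchElementException when hasNext() is false,
        // e.g. if the container bytes are truncated or altered.
        return stream.next();
    }
}

So it looks like either the payload bytes are being truncated or altered
somewhere between producer and consumer, or I'm not extracting them from the
Message correctly.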


On 2012-09-24, at 3:44 PM, Frank Grimes <frankgrime...@yahoo.com> wrote:

> Hi Murtaza,
> 
> I don't think it's the same issue.
> I'm using the exact same schema on both ends. 
> I generated an IndexedRecord class from an avsc file, and both the encoder
> and decoder reference that same generated type (and therefore the same
> IndexedRecord.$SCHEMA).
> 
> Also, the payload size mismatch is on the raw ByteBuffer before any decoding
> occurs, is it not? (No Avro or schema is involved at that point on the
> consumer side.)
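> 
> For reference, I'm logging the two sizes roughly like this (a simplified
> sketch with placeholder names; it assumes the 0.7 kafka.message.Message API
> where payload() returns a ByteBuffer):
> 
> import java.nio.ByteBuffer;
> 
> import kafka.message.Message;
> 
> // Sketch of how the two sizes are compared; names are placeholders.
> public class PayloadSizeLogger {
> 
>     // Consumer side: size of the raw payload before any Avro decoding.
>     public static void logConsumerSide(Message message) {
>         ByteBuffer payload = message.payload();
>         System.out.println("Consumer payload size: " + payload.remaining());
>     }
> 
>     // Producer side: size of the encoded Avro container handed to Kafka.
>     public static void logProducerSide(byte[] encodedAvroBytes) {
>         System.out.println("Producer payload size: " + encodedAvroBytes.length);
>     }
> }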
> 
> Regards,
> 
> Emanuel
> 
> 
> On 2012-09-24, at 3:29 PM, Murtaza Doctor <murt...@richrelevance.com> wrote:
> 
>> Hi Frank,
>> 
>> I had a similar issue, which turned out to be an Avro schema mismatch between
>> the producer and consumer ends. We are now working on an Avro schema
>> repository service to handle this. If you can share your code snippet and
>> stack trace, I will be able to better judge the issue.
>> 
>> Regards,
>> murtaza
>> 
>> On 9/24/12 10:57 AM, "Frank Grimes" <frankgrime...@yahoo.com> wrote:
>> 
>>> Hi All,
>>> 
>>> I'm new to Kafka and am having trouble sending/receiving messages in Avro
>>> format.
>>> I have Kafka 0.7.1-incubating talking to a standalone ZooKeeper 3.3.6
>>> installation.
>>> The String producer/consumer examples in the quick start guide are working
>>> fine, so I believe my setup and config are correct.
>>> 
>>> After getting that working, I wrote an encoder and a decoder for my custom
>>> Avro record type. They use Avro's DataFileWriter and DataFileStream to
>>> encode and decode the Message payload.
>>> I've tested the encoder and decoder standalone and they work fine.
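>>> 
>>> Roughly, the encode side looks like the sketch below (simplified, with
>>> placeholder names; the real code uses my custom record class rather than
>>> GenericRecord):
>>> 
>>> import java.io.ByteArrayOutputStream;
>>> 
>>> import org.apache.avro.Schema;
>>> import org.apache.avro.file.DataFileWriter;
>>> import org.apache.avro.generic.GenericDatumWriter;
>>> import org.apache.avro.generic.GenericRecord;
>>> 
>>> // Simplified sketch of the encode path; names are placeholders.
>>> public class SketchEncoder {
>>>     public byte[] toBytes(GenericRecord event, Schema schema) throws Exception {
>>>         ByteArrayOutputStream out = new ByteArrayOutputStream();
>>>         DataFileWriter<GenericRecord> writer = new DataFileWriter<GenericRecord>(
>>>                 new GenericDatumWriter<GenericRecord>(schema));
>>>         writer.create(schema, out);   // writes the Avro container header
>>>         writer.append(event);
>>>         writer.close();               // flushes the container to the byte array
>>>         return out.toByteArray();
>>>     }
>>> }
>>> 
>>> The returned bytes become the Kafka Message payload, and the decoder does
>>> the reverse with DataFileStream.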
>>> 
>>> The problem I'm having is that when these messages are consumed through a
>>> Kafka consumer, they no longer deserialize correctly.
>>> Specifically, I get a NoSuchElementException thrown from
>>> DataFileStream.next().
>>> Also, I noticed that the Message payload size on the producer side does not
>>> match the payload size on the consumer side.
>>> 
>>> Producer payload size: 2404
>>> Consumer payload size: 2326
>>> 
>>> I tried disabling Avro compression, but the problem remains.
>>> 
>>> Any help would be greatly appreciated...
>>> 
>>> Thanks,
>>> 
>>> Frank Grimes
>> 
> 
