I am using Spark 1.6.1 and Kafka 0.9+. It works for me in both receiver and 
receiver-less modes.

One thing I noticed: when you specify an invalid topic name, KafkaUtils doesn't 
fetch any messages. So, check that you have specified the topic name correctly.

~Muthu
________________________________________
From: Mail.com [pradeep.mi...@mail.com]
Sent: Monday, May 16, 2016 9:33 PM
To: Ramaswamy, Muthuraman
Cc: Cody Koeninger; spark users
Subject: Re: KafkaUtils.createDirectStream Not Fetching Messages with Confluent 
Serializers as Value Decoder.

Hi Muthu,

Are you on Spark 1.4.1 and Kafka 0.8.2? I have a similar issue even for simple 
string messages.

The console producer and consumer work fine, but Spark always returns empty 
RDDs. I am using the receiver-based approach.

Thanks,
Pradeep

> On May 16, 2016, at 8:19 PM, Ramaswamy, Muthuraman 
> <muthuraman.ramasw...@viasat.com> wrote:
>
> Yes, I can see the messages. Also, I wrote a quick custom decoder for Avro, 
> and it works fine for the following:
>
>>> kvs = KafkaUtils.createDirectStream(ssc, [topic], {"metadata.broker.list": 
>>> brokers}, valueDecoder=decoder)
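>
For reference, a plain Avro valueDecoder along those lines might look roughly 
like the sketch below. The schema, topic, and broker strings are placeholders, 
and it assumes the Python 2 avro package and that the writer schema is known 
to the consumer up front:

    import io
    import avro.io
    import avro.schema
    from pyspark import SparkContext
    from pyspark.streaming import StreamingContext
    from pyspark.streaming.kafka import KafkaUtils

    # Placeholder writer schema -- must match what the producer used.
    WRITER_SCHEMA = avro.schema.parse("""
        {"type": "record", "name": "Event",
         "fields": [{"name": "id", "type": "string"}]}
    """)

    def avro_decoder(raw_bytes):
        # Spark passes the raw Kafka message value as a byte string.
        if raw_bytes is None:
            return None
        reader = avro.io.DatumReader(WRITER_SCHEMA)
        return reader.read(avro.io.BinaryDecoder(io.BytesIO(raw_bytes)))

    sc = SparkContext(appName="AvroDirectStream")
    ssc = StreamingContext(sc, 10)
    kvs = KafkaUtils.createDirectStream(
        ssc, ["my_topic"], {"metadata.broker.list": "broker1:9092"},
        valueDecoder=avro_decoder)
    kvs.map(lambda kv: kv[1]).pprint()
    ssc.start()
    ssc.awaitTermination()

Because the writer schema is hard-coded here, this only works when every 
message on the topic was produced with that exact schema.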
>
> But when I use the Confluent serializers to leverage the Schema Registry 
> (based on the link shown below), it doesn't work for me. I am not sure 
> whether I need to configure anything more to consume from the Schema 
> Registry. I can fetch the schema from the Schema Registry by its ID, but 
> the decoder method is not returning any values for me.
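>
If the producer uses the Confluent KafkaAvroSerializer, the message value is 
not plain Avro: it starts with a magic byte (0) and a 4-byte big-endian schema 
ID, and the Avro body follows. A consumer-side valueDecoder therefore has to 
strip that 5-byte header and look the writer schema up by ID in the Schema 
Registry. A rough sketch of such a decoder (the registry URL is a placeholder; 
it uses the Python 2-era urllib2 and avro packages, and omits error handling):

    import io
    import json
    import struct
    import urllib2
    import avro.io
    import avro.schema

    SCHEMA_REGISTRY_URL = "http://schema-registry:8081"  # placeholder
    _schema_cache = {}

    def fetch_schema(schema_id):
        # GET /schemas/ids/{id} returns {"schema": "<avro schema json>"}
        if schema_id not in _schema_cache:
            resp = urllib2.urlopen("%s/schemas/ids/%d"
                                   % (SCHEMA_REGISTRY_URL, schema_id))
            _schema_cache[schema_id] = avro.schema.parse(
                json.loads(resp.read())["schema"])
        return _schema_cache[schema_id]

    def confluent_avro_decoder(raw_bytes):
        if raw_bytes is None:
            return None
        # Confluent wire format: magic byte 0, 4-byte big-endian schema id,
        # then the Avro-encoded body.
        magic, schema_id = struct.unpack(">bI", raw_bytes[:5])
        if magic != 0:
            raise ValueError("not Confluent wire format")
        reader = avro.io.DatumReader(fetch_schema(schema_id))
        return reader.read(avro.io.BinaryDecoder(io.BytesIO(raw_bytes[5:])))

    # Then pass it the same way as before, e.g.:
    # kvs = KafkaUtils.createDirectStream(ssc, [topic],
    #           {"metadata.broker.list": brokers},
    #           valueDecoder=confluent_avro_decoder)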
>
> ~Muthu
>
>
>
>> On 5/16/16, 10:49 AM, "Cody Koeninger" <c...@koeninger.org> wrote:
>>
>> Have you checked to make sure you can receive messages just using a
>> byte array for value?
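>>
As a concrete way to do that check from PySpark, you can pass identity 
decoders so Spark hands back the raw byte strings and simply count them per 
batch. A minimal sketch (topic and broker names are placeholders):

    from pyspark import SparkContext
    from pyspark.streaming import StreamingContext
    from pyspark.streaming.kafka import KafkaUtils

    sc = SparkContext(appName="RawBytesCheck")
    ssc = StreamingContext(sc, 10)

    # Identity decoders: keep key and value as the raw bytes from Kafka.
    raw = KafkaUtils.createDirectStream(
        ssc, ["my_topic"], {"metadata.broker.list": "broker1:9092"},
        keyDecoder=lambda k: k, valueDecoder=lambda v: v)

    raw.count().pprint()  # non-zero counts mean messages are arriving
    raw.map(lambda kv: len(kv[1]) if kv[1] is not None else 0).pprint()

    ssc.start()
    ssc.awaitTermination()

If the counts stay at zero even with identity decoders, the problem is 
upstream of any Avro decoding (topic name, broker list, offsets) rather than 
the decoder itself.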
>>
>> On Mon, May 16, 2016 at 12:33 PM, Ramaswamy, Muthuraman
>> <muthuraman.ramasw...@viasat.com> wrote:
>>> I am trying to consume Avro-formatted messages through
>>> KafkaUtils.createDirectStream. I followed the example listed below (see
>>> link), but the messages are not being fetched by the stream.
>>>
>>> http://stackoverflow.com/questions/30339636/spark-python-avro-kafka-deserialiser
>>>
>>> Is there any code missing that I must add to make the above sample work?
>>> For instance, I am not sure how the Confluent serializers would know the
>>> Avro schema when they are given only the Schema Registry URL.
>>>
>>> Appreciate your help.
>>>
>>> ~Muthu

---------------------------------------------------------------------
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org
