Also, you mentioned the streaming-kafka-0-10 connector. Which connector is
this, and do you know its dependency? I did not see it mentioned in the
documentation.
For my current setup, Spark 1.6.1 against a standalone Kafka 0.10.0.1, the
only dependencies I have are:

compile group: 'org.apache.spark', name: 'spark-core_2.10', version: '1.6.1'
compile group: 'org.apache.spark', name: 'spark-streaming_2.10', version: '1.6.1'
compile group: 'org.apache.spark', name: 'spark-streaming-kafka_2.10', version: '1.6.1'
compile group: 'org.apache.spark', name: 'spark-sql_2.10', version: '1.6.1'

For Spark 2.0 with Kafka 0.10.0.1, do I need a different Kafka connector
dependency?
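From what I can piece together, the artifact Cody mentions would be
spark-streaming-kafka-0-10. My unverified guess at the Gradle coordinate,
assuming Spark 2.0.0 and its default Scala 2.11 build, would be:

compile group: 'org.apache.spark', name: 'spark-streaming-kafka-0-10_2.11', version: '2.0.0' // unverified guess

Does that look right?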


On Thu, Sep 22, 2016 at 2:21 PM, sagarcasual . <sagarcas...@gmail.com>
wrote:

> Hi Cody,
> Thanks for the response.
> One thing I forgot to mention is that I am using the direct approach (no
> receivers) in Spark Streaming; a rough sketch of my setup is below.
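>
> (Minimal sketch only; the app name, broker address, and topic name are
> placeholders rather than my real configuration.)
>
> import kafka.serializer.StringDecoder
> import org.apache.spark.SparkConf
> import org.apache.spark.streaming.{Seconds, StreamingContext}
> import org.apache.spark.streaming.kafka.KafkaUtils
>
> val conf = new SparkConf().setAppName("kafka-direct-example") // placeholder name
> val ssc = new StreamingContext(conf, Seconds(10))
> // Direct approach: no receivers, the executors fetch from the Kafka brokers themselves.
> val kafkaParams = Map("metadata.broker.list" -> "broker1:9092") // placeholder broker
> val topics = Set("my-topic") // placeholder topic
> val stream = KafkaUtils.createDirectStream[String, String, StringDecoder, StringDecoder](
>   ssc, kafkaParams, topics)
> stream.map(_._2).print() // message values only
> ssc.start()
> ssc.awaitTermination()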
>
> I am not sure I have the leverage to upgrade at this point, but do you
> know whether the jump from Spark 1.6.1 to Spark 2.0 is usually smooth, or
> whether it involves a lot of hiccups?
> Also, is there a migration guide or something similar?
>
> -Regards
> Sagar
>
> On Thu, Sep 22, 2016 at 1:39 PM, Cody Koeninger <c...@koeninger.org>
> wrote:
>
>> Do you have the ability to try using Spark 2.0 with the
>> streaming-kafka-0-10 connector?
>>
>> I'd expect the 1.6.1 version to be compatible with kafka 0.10, but it
>> would be good to rule that out.
>>
>> On Thu, Sep 22, 2016 at 1:37 PM, sagarcasual . <sagarcas...@gmail.com>
>> wrote:
>> > Hello,
>> >
>> > I am trying to stream data out of a Kafka cluster (2.11_0.10.0.1) using
>> > Spark 1.6.1.
>> > I am receiving the following error, and I have confirmed that the topic
>> > I am trying to connect to exists and has data.
>> >
>> > Any idea what could be the case?
>> >
>> > kafka.common.UnknownTopicOrPartitionException
>> > at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
>> > at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
>> > at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
>> > at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
>> > at java.lang.Class.newInstance(Class.java:442)
>> > at kafka.common.ErrorMapping$.exceptionFor(ErrorMapping.scala:102)
>> > at org.apache.spark.streaming.kafka.KafkaRDD$KafkaRDDIterator.handleFetchErr(KafkaRDD.scala:184)
>> > at org.apache.spark.streaming.kafka.KafkaRDD$KafkaRDDIterator.fetchBatch(KafkaRDD.scala:193)
>> > at org.apache.spark.streaming.kafka.KafkaRDD$KafkaRDDIterator.getNext(KafkaRDD.scala:208)
>> > at org.apache.spark.util.NextIterator.hasNext(NextIterator.scala:73)
>> > at scala.collection.convert.Wrappers$IteratorWrapper.hasNext(Wrappers.scala:29)
>> >
>> >
>>
>
>
