Hi,

This package does not depend on any specific Spark release and can be used
with 1.5. Please refer to the "How To" section here:

https://spark-packages.org/package/dibbhatt/kafka-spark-consumer

You will also find more information on how to use this package in the README file.
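
For example, you can pull the consumer into a Spark 1.5 shell or job with the
--packages flag (a sketch: the version placeholder and the job class/jar names
below are hypothetical; check the Spark Packages page for the current release):

```shell
# Launch spark-shell with the kafka-spark-consumer package.
# Replace <version> with the release listed on spark-packages.org.
$SPARK_HOME/bin/spark-shell --packages dibbhatt:kafka-spark-consumer:<version>

# The same coordinate works for spark-submit:
$SPARK_HOME/bin/spark-submit \
  --packages dibbhatt:kafka-spark-consumer:<version> \
  --class com.example.MyStreamingJob myjob.jar
```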

Regards,
Dibyendu


On Thu, Aug 25, 2016 at 7:01 PM, <mdkhajaasm...@gmail.com> wrote:

> Hi Dibyendu,
>
> Looks like it is available in 2.0, but we are using an older version of Spark
> (1.5). Could you please let me know how to use this with older versions?
>
> Thanks,
> Asmath
>
>
> On Aug 25, 2016, at 6:33 AM, Dibyendu Bhattacharya <
> dibyendu.bhattach...@gmail.com> wrote:
>
> Hi,
>
> I have released the latest version of the Receiver-based Kafka Consumer for
> Spark Streaming.
>
> The receiver is compatible with Kafka versions 0.8.x, 0.9.x, and 0.10.x, and
> with all Spark versions.
>
> Available at Spark Packages: https://spark-packages.org/package/dibbhatt/kafka-spark-consumer
>
> Also on GitHub: https://github.com/dibbhatt/kafka-spark-consumer
>
> Salient features:
>
>    - End-to-end no data loss without a Write Ahead Log (WAL)
>    - ZK-based offset management for both consumed and processed offsets
>    - No dependency on WAL or checkpointing
>    - Built-in PID controller for rate limiting and backpressure management
>    - Custom message interceptor
>
> Please refer to https://github.com/dibbhatt/kafka-spark-consumer/blob/master/README.md
> for more details.
>
>
> Regards,
>
> Dibyendu
>
>
>
