[
https://issues.apache.org/jira/browse/SPARK-12177?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15331026#comment-15331026
]
Jinxia Liu commented on SPARK-12177:
------------------------------------
[[email protected]] Thanks for the quick reply.
1. Glad to know you are checking it.
2. I agree the Kafka 0.10 consumer is not difficult to use, but in most cases
with the connector the consumer is assigned the topic partitions rather than
subscribing to them, so the connector needs to know all the partitions of a
topic (see the sketch below). If the upstream Kafka topic changes, the consumer
code has to be updated manually. Maybe there are two sides to this issue; since
you are against this, let's keep the code as it is now.
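
For reference, a minimal sketch of the two styles, assuming the
spark-streaming-kafka-0-10 ConsumerStrategies API; the broker address, group
id, topic name, and partition list are hypothetical placeholders:

import org.apache.kafka.common.TopicPartition
import org.apache.kafka.common.serialization.StringDeserializer
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}
import org.apache.spark.streaming.kafka010._

val conf = new SparkConf().setAppName("kafka-010-sketch").setMaster("local[2]")
val ssc = new StreamingContext(conf, Seconds(5))

// Hypothetical consumer configuration for illustration only.
val kafkaParams = Map[String, Object](
  "bootstrap.servers" -> "localhost:9092",
  "key.deserializer" -> classOf[StringDeserializer],
  "value.deserializer" -> classOf[StringDeserializer],
  "group.id" -> "example-group")

// Subscribe: the consumer tracks the topic's partitions itself, so partitions
// added upstream can be picked up without changing application code.
val subscribed = KafkaUtils.createDirectStream[String, String](
  ssc,
  LocationStrategies.PreferConsistent,
  ConsumerStrategies.Subscribe[String, String](Seq("events"), kafkaParams))

// Assign: the application lists every TopicPartition explicitly, so a
// repartitioned upstream topic requires updating this list by hand.
val assigned = KafkaUtils.createDirectStream[String, String](
  ssc,
  LocationStrategies.PreferConsistent,
  ConsumerStrategies.Assign[String, String](
    Seq(new TopicPartition("events", 0), new TopicPartition("events", 1)),
    kafkaParams))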
> Update KafkaDStreams to new Kafka 0.10 Consumer API
> ---------------------------------------------------
>
> Key: SPARK-12177
> URL: https://issues.apache.org/jira/browse/SPARK-12177
> Project: Spark
> Issue Type: Improvement
> Components: Streaming
> Affects Versions: 1.6.0
> Reporter: Nikita Tarasenko
> Labels: consumer, kafka
>
> Kafka 0.9 has already been released, and it introduces a new consumer API
> that is not compatible with the old one. So I added the new consumer API. I
> made separate classes in the package org.apache.spark.streaming.kafka.v09
> with the changed API. I did not remove the old classes, for better backward
> compatibility, so users will not need to change their old Spark applications
> when they upgrade to the new Spark version. Please review my changes.