[
https://issues.apache.org/jira/browse/SPARK-19863?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Hyukjin Kwon resolved SPARK-19863.
----------------------------------
Resolution: Incomplete
> Whether or not to use CachedKafkaConsumer should be configurable when using
> DirectKafkaInputDStream to connect to Kafka in a Spark Streaming application
> --------------------------------------------------------------------------------------------------------------------------------------------------------
>
> Key: SPARK-19863
> URL: https://issues.apache.org/jira/browse/SPARK-19863
> Project: Spark
> Issue Type: Bug
> Components: DStreams, Input/Output
> Affects Versions: 2.1.0
> Reporter: LvDongrong
> Priority: Major
> Labels: bulk-closed
>
> Whether or not to use CachedKafkaConsumer should be configurable when using
> DirectKafkaInputDStream to connect to Kafka in a Spark Streaming
> application. In Spark 2.x, the plain Kafka consumer was replaced by
> CachedKafkaConsumer (which keeps established connections to the Kafka
> cluster), and there is no way to change this behavior. In fact, KafkaRDD
> (used by DirectKafkaInputDStream to connect to Kafka) provides the
> parameter useConsumerCache to choose whether to use the CachedKafkaConsumer,
> but DirectKafkaInputDStream hard-codes this parameter to true.
--
This message was sent by Atlassian JIRA
(v7.6.3#76005)