[ https://issues.apache.org/jira/browse/SPARK-5505?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Cody Koeninger closed SPARK-5505.
---------------------------------
    Resolution: Won't Fix

The old Kafka high-level consumer has been abandoned at this point.
SPARK-12177 and SPARK-15406 use the new consumer API.
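For reference, the new consumer API is exposed through the direct stream in the spark-streaming-kafka-0-10 module, which avoids the ZooKeeper-based rebalancing that caused this exception. A minimal sketch follows; the broker address, topic name, group id, and the `streamingContext` variable are placeholders, not values from this issue:

```scala
import org.apache.kafka.common.serialization.StringDeserializer
import org.apache.spark.streaming.kafka010.KafkaUtils
import org.apache.spark.streaming.kafka010.LocationStrategies.PreferConsistent
import org.apache.spark.streaming.kafka010.ConsumerStrategies.Subscribe

// Consumer configuration for the new (0.10+) Kafka consumer API.
// Offsets are managed by Kafka itself, not ZooKeeper.
val kafkaParams = Map[String, Object](
  "bootstrap.servers" -> "localhost:9092",          // placeholder broker
  "key.deserializer" -> classOf[StringDeserializer],
  "value.deserializer" -> classOf[StringDeserializer],
  "group.id" -> "example-group",                    // placeholder group id
  "auto.offset.reset" -> "latest",
  "enable.auto.commit" -> (false: java.lang.Boolean)
)

// Direct stream: each Spark partition maps to a Kafka partition,
// so there is no high-level-consumer rebalance step at all.
val stream = KafkaUtils.createDirectStream[String, String](
  streamingContext,                 // assumed existing StreamingContext
  PreferConsistent,
  Subscribe[String, String](Array("events"), kafkaParams)
)
```

Because the direct stream computes offset ranges per batch instead of coordinating consumer threads through ZooKeeper, the `syncedRebalance` path that fails here simply does not exist in the new API.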

> ConsumerRebalanceFailedException from Kafka consumer
> ----------------------------------------------------
>
>                 Key: SPARK-5505
>                 URL: https://issues.apache.org/jira/browse/SPARK-5505
>             Project: Spark
>          Issue Type: Bug
>          Components: Streaming
>    Affects Versions: 1.2.0
>         Environment: CentOS6 / Linux 2.6.32-358.2.1.el6.x86_64
> java version "1.7.0_21"
> Scala compiler version 2.9.3
> 2 cores Intel(R) Xeon(R) CPU E5620  @ 2.40GHz / 16G RAM
> VMWare VM.
>            Reporter: Greg Temchenko
>            Priority: Critical
>
> From time to time Spark Streaming produces a ConsumerRebalanceFailedException 
> and stops receiving messages. After that, all subsequent RDDs are empty.
> {code}
> 15/01/30 18:18:36 ERROR consumer.ZookeeperConsumerConnector: [terran_vmname-1422670149779-243b4e10], error during syncedRebalance
> kafka.common.ConsumerRebalanceFailedException: terran_vmname-1422670149779-243b4e10 can't rebalance after 4 retries
>       at kafka.consumer.ZookeeperConsumerConnector$ZKRebalancerListener.syncedRebalance(ZookeeperConsumerConnector.scala:432)
>       at kafka.consumer.ZookeeperConsumerConnector$ZKRebalancerListener$$anon$1.run(ZookeeperConsumerConnector.scala:355)
> {code}
> The problem is also described in the mailing list: 
> http://apache-spark-user-list.1001560.n3.nabble.com/Error-when-Spark-streaming-consumes-from-Kafka-td19570.html
> As I understand it, this is a critical blocker for production use of 
> Kafka-Spark streaming.



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
