Hi Dibyendu,
Thank you for your reply.
I am using the Kafka consumer https://github.com/dibbhatt/kafka-spark-consumer, which
uses spark-core and spark-streaming *1.2.2*.
The Spark cluster on which I am running the application is *1.3.1*. I will test it
with the latest changes.
Yes, the underlying BlockManager gives an error.
It seems to be related to this JIRA:
https://issues.apache.org/jira/browse/SPARK-3612 ?
On Tue, Jun 9, 2015 at 7:39 AM, Dibyendu Bhattacharya <
dibyendu.bhattach...@gmail.com> wrote:
> Hi Snehal
>
> Are you running the latest kafka consumer from github/spark-packages ? If
> not can you take the l
Hi Snehal,
Are you running the latest Kafka consumer from github/spark-packages? If
not, can you take the latest changes? This low-level receiver will keep
retrying if the underlying BlockManager gives an error. Do you see
those retry cycles in the log? If yes, then there is an issue writing blocks.
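For context, the retry behavior described above follows the usual bounded-retry-with-backoff pattern. The sketch below is illustrative only, not the library's actual code: the `withRetry` helper, the attempt count, and the backoff values are all assumptions, standing in for whatever the receiver does internally when a block write fails.

```java
import java.util.concurrent.Callable;

public class RetrySketch {
    // Retry `action` up to maxAttempts times, doubling the wait between
    // attempts. Rethrows the last failure once attempts are exhausted.
    static <T> T withRetry(Callable<T> action, int maxAttempts, long initialBackoffMs)
            throws Exception {
        long backoff = initialBackoffMs;
        Exception last = null;
        for (int attempt = 1; attempt <= maxAttempts; attempt++) {
            try {
                return action.call();
            } catch (Exception e) {
                last = e;
                // These are the retry cycles you would look for in the log.
                System.err.println("attempt " + attempt + " failed: " + e.getMessage());
                if (attempt < maxAttempts) {
                    Thread.sleep(backoff);
                    backoff *= 2;
                }
            }
        }
        throw last;
    }

    public static void main(String[] args) throws Exception {
        // Hypothetical "store block" action that fails twice, then succeeds,
        // mimicking a transient BlockManager error.
        final int[] calls = {0};
        String result = withRetry(() -> {
            if (++calls[0] < 3) throw new RuntimeException("BlockManager error");
            return "block stored";
        }, 5, 10);
        System.out.println(result);
    }
}
```

If the log shows these cycles repeating without ever succeeding, the block writes are failing persistently rather than transiently, which points at the BlockManager rather than the receiver.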
All,
I am using the Kafka Spark Consumer
https://github.com/dibbhatt/kafka-spark-consumer in a Spark Streaming job.
After the streaming job runs for a few hours, all executors exit, but I
still see the status of the application on the Spark UI as running.
Does anyone know the cause of this exception and how to fix it?