exception in task 0.0 in stage 0.0 (TID 0)
java.lang.AssertionError: assertion failed: Failed to get records for
spark-executor-null mytopic2 0 2 after polling for 512
==
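(The "polling for 512" in the assertion above is the executor consumer's poll timeout in milliseconds — 512 ms is the default in spark-streaming-kafka-0-10. If the broker is merely slow to respond, raising it is a common first step. A minimal sketch, assuming Spark 2.x style configuration:

```scala
import org.apache.spark.SparkConf

// Raise the executor-side Kafka poll timeout (default 512 ms in
// spark-streaming-kafka-0-10) so a slow broker has more time to answer
// before the "Failed to get records ... after polling" assertion fires.
val conf = new SparkConf()
  .setAppName("kafka-stream")
  .set("spark.streaming.kafka.consumer.poll.ms", "10000")
```

This only helps when the records exist but arrive slowly; it does not fix missing offsets.)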
-----Original Message-----
From: Cody Koeninger [mailto:c...@koeninger.org]
Sent: October 13, 2016 9:31
To: Haopu Wang
Cc: user@spark.apache.org
Subject: Re: Kafka integration: get existing Kafka messages?
Look at the presentation and blog post linked from
https://github.com
> To: Haopu Wang
> Cc: user@spark.apache.org
> Subject: Re: Kafka integration: get existing Kafka messages?
>
> It's set to "none" for the executors, because otherwise they won't do
> exactly what the driver told them to do.
>
> You should be able to set up the driver consumer to determine batches
> however you want, though.
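(Cody's point above can be sketched in code: in spark-streaming-kafka-0-10, the "auto.offset.reset" you pass is honored by the driver consumer when choosing starting offsets, and only the executor consumers are forcibly overridden to "none". A minimal sketch, assuming a local broker at localhost:9092 and the topic name "mytopic2" from this thread; the group id is made up:

```scala
import org.apache.kafka.common.serialization.StringDeserializer
import org.apache.spark.SparkConf
import org.apache.spark.streaming.kafka010._
import org.apache.spark.streaming.{Seconds, StreamingContext}

val conf = new SparkConf().setAppName("kafka-earliest").setMaster("local[2]")
val ssc = new StreamingContext(conf, Seconds(5))

val kafkaParams = Map[String, Object](
  "bootstrap.servers" -> "localhost:9092",
  "key.deserializer" -> classOf[StringDeserializer],
  "value.deserializer" -> classOf[StringDeserializer],
  "group.id" -> "my-consumer-group",
  // Honored by the driver consumer when no committed offsets exist, so the
  // first batch starts from the beginning of each partition; Spark rewrites
  // this to "none" on the executors, which only fetch the exact offset
  // ranges the driver assigned.
  "auto.offset.reset" -> "earliest",
  "enable.auto.commit" -> (false: java.lang.Boolean)
)

// Existing messages land in the first batch; new messages follow as usual.
val stream = KafkaUtils.createDirectStream[String, String](
  ssc,
  LocationStrategies.PreferConsistent,
  ConsumerStrategies.Subscribe[String, String](Seq("mytopic2"), kafkaParams)
)
```

If the group already has committed offsets, "earliest" is ignored; use ConsumerStrategies.Assign with explicit starting offsets to force a replay in that case.)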
On Wednesday, October 12, 2016, Haopu Wang wrote:
> Hi,
>
> I want to read the existing Kafka messages and then subscribe to new
> stream messages. But I find the "auto.offset.reset" property is always
> set to "none" in KafkaUtils. Does that mean I cannot specify the
> "earliest" value when creating a direct stream?
>
> Thank you!