>>>> JavaStreamingContext jssc = new JavaStreamingContext(sparkConf,
>>>> Durations.milliseconds(...));
>>>>
>>>> The batchDuration parameter is "The time interval at which streaming
>>>> data will be divided into batches". Can this be worked somehow to cause
>>>> Spark Streaming to just get all the available data, then let all the
>>>> RDDs within the Kafka discretized stream get processed, and then be
>>>> done and terminate, rather than wait another period and try to process
>>>> any more data from Kafka?
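The thread ends here without an answer. One way to get the one-shot behavior the question asks for (a sketch, not from this thread) is to skip the streaming context and its batchDuration entirely, and instead read a fixed Kafka offset range as a plain batch RDD via KafkaUtils.createRDD from the Spark 1.3+ spark-streaming-kafka (Kafka 0.8) artifact. The broker address, topic name, and offsets below are placeholder assumptions:

```java
import java.util.HashMap;
import java.util.Map;

import kafka.serializer.StringDecoder;

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaPairRDD;
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.streaming.kafka.KafkaUtils;
import org.apache.spark.streaming.kafka.OffsetRange;

public class ReadKafkaOnce {
  public static void main(String[] args) {
    SparkConf sparkConf = new SparkConf().setAppName("ReadKafkaOnce");
    JavaSparkContext jsc = new JavaSparkContext(sparkConf);

    // Kafka 0.8 "simple consumer" parameters; the broker address is a placeholder.
    Map<String, String> kafkaParams = new HashMap<String, String>();
    kafkaParams.put("metadata.broker.list", "localhost:9092");

    // Read exactly this range, once: (topic, partition, fromOffset, untilOffset).
    // In practice untilOffset would come from querying the current latest offsets.
    OffsetRange[] offsetRanges = {
      OffsetRange.create("mytopic", 0, 0L, 1000L)  // placeholder topic and offsets
    };

    // A batch RDD over the offset range: no batchDuration, no recurring
    // intervals; the job processes this data and then simply exits.
    JavaPairRDD<String, String> rdd = KafkaUtils.createRDD(
        jsc,
        String.class, String.class,
        StringDecoder.class, StringDecoder.class,
        kafkaParams,
        offsetRanges);

    System.out.println("records read: " + rdd.count());
    jsc.stop();
  }
}
```

Staying within the streaming API, the closest equivalent would be stopping the JavaStreamingContext after the first batch completes, but the batch createRDD route sidesteps the batchDuration question altogether. (Not testable without a live Kafka broker and a Spark runtime.)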
> --
> View this message in context:
> http://apache-spark-user-list.1001560.n3.nabble.com/How-to-stream-all-data-out-of-a-Kafka-topic-once-then-terminate-job-tp22698.html
> Sent from the Apache Spark User List mailing list archive at Nabble.com.