Hi all,
We're running Spark 1.5.0 on EMR 4.1.0 in AWS and consuming from Kinesis.
We saw the following exception today - it killed the Spark "step":
org.apache.spark.SparkException: Could not read until the end sequence number of the range
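For anyone hitting the same error: the Kinesis receiver checkpoints each stored block as a range of sequence numbers, and on recovery it must be able to re-read that range in full. Below is a rough plain-Python illustration of that invariant (hypothetical names, not actual Spark code) — if the stream can no longer serve records up to the range's end sequence number (for example because they aged out of the stream's retention window), the re-read fails:

```python
# Plain-Python sketch (NOT Spark code; names are hypothetical) of the
# invariant behind "Could not read until the end sequence number of the
# range": a checkpointed block range must be fully re-readable.

def replay_range(stream, start_seq, end_seq):
    """Replay records with sequence numbers in [start_seq, end_seq].

    Raises if the stream no longer holds the whole range, e.g. because
    the records expired out of the stream's retention period.
    """
    got = [r for r in stream if start_seq <= r["seq"] <= end_seq]
    if not got or got[-1]["seq"] < end_seq:
        # The analogue of the SparkException above.
        raise RuntimeError("could not read until end sequence number")
    return [r["data"] for r in got]

# A stream that currently retains only sequence numbers 5..9:
stream = [{"seq": s, "data": "rec-%d" % s} for s in range(5, 10)]

replay_range(stream, 5, 9)    # whole range still present: succeeds
try:
    replay_range(stream, 5, 12)   # tail of the range is gone: fails
except RuntimeError as e:
    print(e)
```

This is only meant to illustrate why the step dies rather than skipping data: silently returning a partial range would lose records.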
We guessed it was because our Kinesis stream didn't

> …tioned, or perhaps you could write the
> processed RDD back to Kinesis, and have the Lambda function read the
> Kinesis stream and write to Redshift?
>
> On Thu, Sep 17, 2015 at 5:48 PM, Alan Dipert <a...@dipert.org> wrote:
>
>> Hello,
>> We are using Spark Streaming 1.4.
Hello,
We are using Spark Streaming 1.4.1 in AWS EMR to process records from
Kinesis. Our Spark program saves RDDs to S3, after which the records are
picked up by a Lambda function that loads them into Redshift. It is
important to us that no data is lost during processing.
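One detail worth noting for the no-data-loss goal: with Spark Streaming's at-least-once semantics, a recovered batch may be written to S3 a second time. Keying each output by its batch time makes the replay idempotent, so the downstream Lambda/Redshift load does not see duplicates. A minimal plain-Python sketch of the idea (hypothetical names, a dict standing in for S3):

```python
# Illustration only: deterministic per-batch S3 keys make replays
# idempotent under at-least-once delivery. A dict stands in for S3.

s3 = {}

def save_batch(batch_time, records):
    # Same batch time -> same key, so a replay overwrites rather than
    # appending a duplicate object for the Lambda to load twice.
    key = "output/batch-%d.json" % batch_time
    s3[key] = list(records)
    return key

save_batch(1000, ["a", "b"])
save_batch(1000, ["a", "b"])   # replay after a failure: no duplicate object
```

After both calls the mock bucket holds exactly one object for batch 1000.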
We have set our Kinesis