[ 
https://issues.apache.org/jira/browse/SPARK-19593?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15867214#comment-15867214
 ] 

Sarath Chandra Jiguru commented on SPARK-19593:
-----------------------------------------------

Note that the issue type of this ticket is Question. In KinesisReceiver.scala it is 
currently only possible to use the default values of KinesisClientLibConfiguration. 
Because of this, even when the stream is capable of serving the required read rate, 
the Kinesis Spark Streaming consumer cannot take advantage of it.

Don't you think it is important to allow Spark consumers to tweak these 
configurations?
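For context, the Kinesis Client Library itself does expose these knobs via builder-style setters on KinesisClientLibConfiguration; the point of this ticket is that Spark's receiver does not pass them through. A minimal sketch of what tuning the fetch size looks like in plain KCL (the application name, stream name, and worker id below are placeholders, and the values are illustrative, not recommendations):

```scala
import com.amazonaws.auth.DefaultAWSCredentialsProviderChain
import com.amazonaws.services.kinesis.clientlibrary.lib.worker.KinesisClientLibConfiguration

// Placeholder identifiers; substitute your own application/stream/worker names.
val kclConfig = new KinesisClientLibConfiguration(
    "my-app",                                // application (checkpoint table) name
    "my-stream",                             // Kinesis stream name
    new DefaultAWSCredentialsProviderChain(),
    "worker-1")                              // worker id
  .withMaxRecords(10000)                     // max records per GetRecords call
  .withIdleTimeBetweenReadsInMillis(250)     // pause between successive fetches
```

Since KinesisReceiver.scala builds its KinesisClientLibConfiguration internally, a consumer has no hook to apply setters like these today.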

> Records read per each kinesis transaction
> -----------------------------------------
>
>                 Key: SPARK-19593
>                 URL: https://issues.apache.org/jira/browse/SPARK-19593
>             Project: Spark
>          Issue Type: Question
>          Components: DStreams
>    Affects Versions: 2.0.1
>            Reporter: Sarath Chandra Jiguru
>            Priority: Trivial
>
> The question is related to the Spark Streaming + Kinesis integration.
> Is there a way to provide a Kinesis consumer configuration, e.g. the number of 
> records read per transaction?
> Right now, even though I am eligible to read 2.8 GB/minute, I am restricted to 
> reading only 100 MB/minute, because I cannot increase the default number of 
> records fetched in each transaction.
> I have raised a question in stackoverflow as well, please look into it:
> http://stackoverflow.com/questions/42107037/how-to-alter-kinesis-consumer-properties-in-spark-streaming
> Kinesis stream setup:
> open shards: 24
> write rate: 440K/minute
> read rate: 1.42M/minute
> read byte rate: 85 MB/minute. I am allowed to read around 2.8 GB/minute (24 
> shards * 2 MB/s * 60 seconds).
> Reference: 
> http://docs.aws.amazon.com/streams/latest/dev/kinesis-record-processor-additional-considerations.html
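The capacity figure quoted above follows directly from the per-shard read limit of 2 MB/s documented by Kinesis; a quick sketch of the arithmetic:

```scala
// Per-minute read capacity = shards * per-shard read limit (MB/s) * 60 s.
val shards = 24
val perShardReadMBps = 2
val capacityMBPerMin = shards * perShardReadMBps * 60
println(capacityMBPerMin) // 2880 MB/minute, i.e. roughly 2.8 GB/minute
```

So the observed 85 MB/minute is about 3% of what the 24-shard stream can serve, which is the gap the reporter is asking to close.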



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)
