Re: Back pressure not working on streaming

2019-02-05 Thread Cody Koeninger
That article is pretty old. If you click through the link to the JIRA mentioned in it, https://issues.apache.org/jira/browse/SPARK-18580 , it has been resolved. On Wed, Jan 2, 2019 at 12:42 AM JF Chen wrote: > > yes, 10 is a very low value for testing initial rate. > And from this article >

Re: Back pressure not working on streaming

2019-01-01 Thread JF Chen
yes, 10 is a very low value for testing initial rate. And from this article https://www.linkedin.com/pulse/enable-back-pressure-make-your-spark-streaming-production-lan-jiang/, it seems Spark back pressure is not available for DStream? So, max rate per partition is the only available back pressure
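
[Editor's note: a minimal sketch of that approach for the spark-streaming-kafka-0-10 direct stream, where the per-partition cap is the practical knob on older Spark versions. Broker address, topic, and group id are placeholders, and the cap value is only illustrative.]

    import org.apache.kafka.common.serialization.StringDeserializer
    import org.apache.spark.SparkConf
    import org.apache.spark.streaming.{Seconds, StreamingContext}
    import org.apache.spark.streaming.kafka010.KafkaUtils
    import org.apache.spark.streaming.kafka010.LocationStrategies.PreferConsistent
    import org.apache.spark.streaming.kafka010.ConsumerStrategies.Subscribe

    val conf = new SparkConf()
      .setAppName("backpressure-sketch")
      // Hard per-partition cap: at most 1000 records per partition per second.
      // With 32 partitions and a 300s batch, any single batch is bounded at
      // 1000 * 32 * 300 = 9,600,000 records, even before backpressure kicks in.
      .set("spark.streaming.kafka.maxRatePerPartition", "1000")
      // PID-based backpressure still tunes later batches from observed delays.
      .set("spark.streaming.backpressure.enabled", "true")

    val ssc = new StreamingContext(conf, Seconds(300))

    val kafkaParams = Map[String, Object](
      "bootstrap.servers"  -> "localhost:9092",            // placeholder broker
      "key.deserializer"   -> classOf[StringDeserializer],
      "value.deserializer" -> classOf[StringDeserializer],
      "group.id"           -> "backpressure-test",          // placeholder group id
      "auto.offset.reset"  -> "earliest"
    )

    val stream = KafkaUtils.createDirectStream[String, String](
      ssc, PreferConsistent, Subscribe[String, String](Seq("my-topic"), kafkaParams))

    stream.foreachRDD(rdd => println(s"records in batch: ${rdd.count()}"))

    ssc.start()
    ssc.awaitTermination()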

Re: Back pressure not working on streaming

2019-01-01 Thread HARSH TAKKAR
There is a separate property for max rate; by default it is not set, so if you want to limit the max rate you should give that property a value. Initial rate = 10 means it will pick only 10 records per receiver in the batch interval when you start the process. Depending upon the consumption
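
[Editor's note: a small sketch of how these properties relate; the values are illustrative and the comments describe the intended semantics rather than measured behavior.]

    import org.apache.spark.SparkConf

    val conf = new SparkConf()
      // Turn on the PID rate estimator that adapts the ingest rate per batch.
      .set("spark.streaming.backpressure.enabled", "true")
      // Rate used only until the estimator has its first feedback; with a
      // very low value like 10, the first batch should be tiny.
      .set("spark.streaming.backpressure.initialRate", "10")
      // Hard ceiling applied to every batch; unset (the default) means no cap.
      .set("spark.streaming.kafka.maxRatePerPartition", "1000")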

Back pressure not working on streaming

2019-01-01 Thread JF Chen
I have set spark.streaming.backpressure.enabled to true, spark.streaming.backpressure.initialRate to 10. Once my application started, it received 32 million messages on the first batch. My application runs every 300 seconds, with 32 Kafka partitions. So what is the max rate if I set initial rate to
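
[Editor's note: a back-of-the-envelope check, under the assumption that the configured rate is applied per Kafka partition per second, of what the first batch would look like if initialRate were honored.]

    val partitions   = 32
    val batchSeconds = 300
    val initialRate  = 10                               // records/partition/sec
    val expectedFirstBatchMax = initialRate * partitions * batchSeconds
    // expectedFirstBatchMax == 96000, nowhere near the 32 million records
    // observed, which is the symptom SPARK-18580 (initialRate ignored by the
    // direct Kafka stream) describes.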