Thanks TD for the response. Can you please provide more explanation? I have multiple streams in my Spark Streaming application (Spark 2.0.2 using DStreams). I know many people use this setting, so your explanation will help a lot of people.
Thanks

On Fri, Mar 10, 2017 at 6:24 PM, Tathagata Das <t...@databricks.com> wrote:

> That config is not safe. Please do not use it.
>
> On Mar 10, 2017 10:03 AM, "shyla deshpande" <deshpandesh...@gmail.com> wrote:
>
>> I have a Spark Streaming application which processes 3 Kafka streams and
>> has 5 output operations.
>>
>> Not sure what the setting for spark.streaming.concurrentJobs should be.
>>
>> 1. If the concurrentJobs setting is 4, does that mean 2 output operations
>> will be run sequentially?
>>
>> 2. If I had 6 cores, what would be an ideal setting for concurrentJobs in
>> this situation?
>>
>> I appreciate your input. Thanks
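For readers finding this thread: `spark.streaming.concurrentJobs` is an undocumented, unsupported property (as TD notes above, it is not safe to rely on). For context only, a minimal sketch of how such a property would be set on a DStreams application; the app name and batch interval here are illustrative assumptions, not from the thread:

```scala
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}

// Illustrative sketch only. "multi-stream-app" and the 10-second batch
// interval are assumptions for the example.
val conf = new SparkConf()
  .setAppName("multi-stream-app")
  // Undocumented and unsupported: controls how many streaming jobs the
  // JobScheduler may run concurrently (default is 1, i.e. output
  // operations of a batch run one at a time, in order).
  .set("spark.streaming.concurrentJobs", "4")

val ssc = new StreamingContext(conf, Seconds(10))
```

With the default of 1, the 5 output operations in a batch are submitted sequentially; raising the value lets several run at once, which is exactly why it is unsafe: it breaks the ordering guarantees that stateful operations and checkpointing assume.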