Re: Spark StreamingContext Question

2018-03-07 Thread रविशंकर नायर
Got it, thanks.

On Wed, Mar 7, 2018 at 4:32 AM, Gerard Maas wrote:
> Hi,
>
> You can run as many jobs in your cluster as you want, provided you have
> enough capacity. The one-StreamingContext constraint is per job.
>
> You can submit several jobs for Flume and some

Re: Spark StreamingContext Question

2018-03-07 Thread Gerard Maas
Hi,

You can run as many jobs in your cluster as you want, provided you have
enough capacity. The one-StreamingContext constraint is per job.

You can submit several jobs for Flume and some others for Twitter, Kafka,
etc. If you are getting started with streaming in Spark, I'd recommend you to
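[To make the per-job constraint concrete, here is a minimal sketch of what
each submitted application looks like, assuming the Spark 2.x Scala API;
the app name, host, port, and batch interval are placeholders:]

    import org.apache.spark.SparkConf
    import org.apache.spark.streaming.{Seconds, StreamingContext}

    object SingleContextApp {
      def main(args: Array[String]): Unit = {
        // One StreamingContext per application (JVM). Each separate
        // spark-submit gets its own context, so several such jobs can
        // run side by side on the same cluster, capacity permitting.
        val conf = new SparkConf().setAppName("SingleContextApp")
        val ssc = new StreamingContext(conf, Seconds(10)) // 10s batches

        // Placeholder source: a text stream from a socket.
        val lines = ssc.socketTextStream("localhost", 9999)
        lines.count().print()

        ssc.start()
        ssc.awaitTermination()
      }
    }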

Re: Spark StreamingContext Question

2018-03-07 Thread sagar grover
Hi,

You can have multiple streams under the same streaming context and process
them accordingly.

With regards,
Sagar Grover
Phone - 7022175584

On Wed, Mar 7, 2018 at 9:26 AM, ☼ R Nair (रविशंकर नायर) <
ravishankar.n...@gmail.com> wrote:
> Hi all,
>
> Understand from the documentation that only one
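[A sketch of the multiple-streams-per-context pattern Sagar describes,
again assuming the Spark 2.x Scala API; the two socket sources and ports
are illustrative stand-ins for, say, a Flume and a Kafka receiver:]

    import org.apache.spark.SparkConf
    import org.apache.spark.streaming.{Seconds, StreamingContext}

    object MultiStreamApp {
      def main(args: Array[String]): Unit = {
        val conf = new SparkConf().setAppName("MultiStreamApp")
        val ssc = new StreamingContext(conf, Seconds(5))

        // Two independent input streams under the single context.
        val streamA = ssc.socketTextStream("localhost", 9998)
        val streamB = ssc.socketTextStream("localhost", 9999)

        // Process each stream separately...
        streamA.map(line => ("A", line)).print()
        streamB.map(line => ("B", line)).print()

        // ...or merge them with union and process them together.
        val merged = streamA.union(streamB)
        merged.count().print()

        ssc.start()
        ssc.awaitTermination()
      }
    }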