Hi,
You can start multiple Spark apps per cluster. You will have one
StreamingContext per app.
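
For example, each application builds its own context. A minimal sketch,
assuming Spark 1.6+ with the spark-streaming-kafka-0-8 connector on the
classpath; the app name, broker address, and topic names below are
placeholders:

import kafka.serializer.StringDecoder
import org.apache.spark.SparkConf
import org.apache.spark.streaming.kafka.KafkaUtils
import org.apache.spark.streaming.{Seconds, StreamingContext}

object MyStreamingApp {
  def main(args: Array[String]): Unit = {
    // One StreamingContext per application (per JVM): the limit is per
    // app, not per cluster.
    val conf = new SparkConf().setAppName("my-streaming-app")
    val ssc = new StreamingContext(conf, Seconds(10))

    // Placeholder broker list and topics; each app subscribes only to
    // the topics it is responsible for.
    val kafkaParams = Map("metadata.broker.list" -> "broker1:9092")
    val topics = Set("topicA", "topicB")

    val stream = KafkaUtils.createDirectStream[
      String, String, StringDecoder, StringDecoder](ssc, kafkaParams, topics)

    // Placeholder processing: just print the record values.
    stream.map(_._2).print()

    ssc.start()
    ssc.awaitTermination()
  }
}

Submit each such app with its own spark-submit; the cluster manager will
schedule them side by side as long as cores and memory are available.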


On 24 Dec 2016 at 18:22, "shyla deshpande" <deshpandesh...@gmail.com>
wrote:

> Hi All,
>
> Thank you for the response.
>
> As per
>
> https://docs.cloud.databricks.com/docs/latest/databricks_guide/index.html#07%20Spark%20Streaming/15%20Streaming%20FAQs.html
>
> there can be only one streaming context in a cluster, which implies only
> one streaming job.
>
> So, I am still confused. If anyone is running more than one Spark
> Streaming app in a cluster at the same time, please share your experience.
>
> Thanks
>
> On Wed, Dec 14, 2016 at 6:54 PM, Akhilesh Pathodia <
> pathodia.akhil...@gmail.com> wrote:
>
>> If you have enough cores/resources, run them separately depending on your
>> use case.
>>
>>
>> On Thursday 15 December 2016, Divya Gehlot <divya.htco...@gmail.com>
>> wrote:
>>
>>> It depends on the use case.
>>> Spark always depends on resource availability.
>>> As long as you have the resources to accommodate them, you can run as
>>> many Spark/Spark Streaming applications as you need.
>>>
>>>
>>> Thanks,
>>> Divya
>>>
>>> On 15 December 2016 at 08:42, shyla deshpande <deshpandesh...@gmail.com>
>>> wrote:
>>>
>>>> How many Spark Streaming applications can run at the same time on a
>>>> Spark cluster?
>>>>
>>>> Is it better to have one Spark Streaming application consume all the
>>>> Kafka topics, or to have multiple streaming applications where possible
>>>> to keep things simple?
>>>>
>>>> Thanks
>>>>
>>>>
>>>
>
