Re: Infinite Loop in Spark

2016-10-27 Thread Mark Hamstra
Using a single SparkContext for an extended period of time is how
long-running Spark applications such as the Spark Job Server work
(https://github.com/spark-jobserver/spark-jobserver). It's an established
pattern.
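The pattern can be sketched in plain Scala: one context created once, reused across iterations, with a flag checked for graceful shutdown. This is only an illustrative sketch; `PeriodicDriver`, `runBatch`, and the stop flag are hypothetical names, not part of spark-jobserver, and the actual Spark work is replaced by a callback.

```scala
import java.util.concurrent.atomic.AtomicBoolean

// Sketch of the long-running driver pattern: the context is created once
// outside the loop and every iteration reuses it. `runBatch` stands in
// for the real Spark actions (hypothetical).
object PeriodicDriver {
  // Flipped to false (e.g. from a shutdown hook) to stop the loop cleanly.
  val running = new AtomicBoolean(true)

  def loop(intervalMs: Long, runBatch: () => Unit,
           maxIters: Int = Int.MaxValue): Int = {
    var iters = 0
    while (running.get() && iters < maxIters) {
      runBatch()  // e.g. an action on the single long-lived SparkContext
      iters += 1
      if (running.get() && iters < maxIters) Thread.sleep(intervalMs)
    }
    iters  // number of completed iterations
  }
}
```

In a real driver you would register a JVM shutdown hook that calls `running.set(false)` and then `sc.stop()`, so the context is torn down exactly once at exit rather than per iteration.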

On Thu, Oct 27, 2016 at 11:46 AM, Gervásio Santos  wrote:

> Hi guys!
>
> I'm developing an application in Spark that I'd like to run continuously.
> It would execute some actions, sleep for a while and go again. I was
> thinking of doing it in a standard infinite loop way.
>
> val sc = ...
> while (true) {
>   doStuff(...)
>   sleep(...)
> }
>
> I would be running this (fairly lightweight) application on a cluster
> that would also run other (significantly heavier) jobs. However, I fear
> that this kind of code might lead to unexpected behavior; I don't know if
> keeping the same SparkContext active continuously for a very long time
> might lead to some weird stuff happening.
>
> Can anyone tell me if there is some problem with not "renewing" the Spark
> context, or is anyone aware of any problems with this approach that I
> might be missing?
>
> Thanks!
>


Infinite Loop in Spark

2016-10-27 Thread Gervásio Santos
Hi guys!

I'm developing an application in Spark that I'd like to run continuously.
It would execute some actions, sleep for a while and go again. I was
thinking of doing it in a standard infinite loop way.

val sc = ...
while (true) {
  doStuff(...)
  sleep(...)
}

I would be running this (fairly lightweight) application on a cluster
that would also run other (significantly heavier) jobs. However, I fear
that this kind of code might lead to unexpected behavior; I don't know if
keeping the same SparkContext active continuously for a very long time
might lead to some weird stuff happening.

Can anyone tell me if there is some problem with not "renewing" the Spark
context, or is anyone aware of any problems with this approach that I
might be missing?

Thanks!
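One way to make a loop like the one above robust is to guard each iteration, so a single failed batch does not kill the driver and its long-lived context. A plain-Scala sketch (`doWork` is a hypothetical stand-in for the Spark actions):

```scala
import scala.util.{Try, Success, Failure}

// Run `doWork` for a fixed number of iterations, catching per-iteration
// failures so the surrounding driver (and its SparkContext) stays alive.
// Returns (successful iterations, failed iterations).
def runGuarded(doWork: () => Unit, iterations: Int): (Int, Int) = {
  var ok = 0
  var failed = 0
  for (_ <- 1 to iterations) {
    Try(doWork()) match {
      case Success(_) => ok += 1
      case Failure(e) => failed += 1  // log `e` and continue with the next batch
    }
  }
  (ok, failed)
}
```

In a real deployment you would log or alert on each `Failure` instead of just counting it, and still let an unrecoverable error escape and stop the application.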