Subject: Re: Session for connections?
To: as...@live.com
CC: user@spark.apache.org
That is your call. If you think it is not a problem to have a large
number of open but idle connections to your data store, then it is
probably okay to let them hang around until the executor shuts down.
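If idle connections do become a concern, one alternative to letting them hang around is a pool that closes connections after an idle timeout. A minimal sketch, assuming the connection object exposes a close() method; IdleAwarePool and all names here are hypothetical, not part of any Spark or RabbitMQ API:

```python
import time


class IdleAwarePool:
    """Toy pool that closes connections left idle longer than max_idle_s.

    `connect` is a stand-in factory for whatever client your data
    store uses (hypothetical, not a real API).
    """

    def __init__(self, connect, max_idle_s=60.0):
        self._connect = connect
        self._max_idle_s = max_idle_s
        self._idle = []  # list of (connection, last_used_timestamp)

    def acquire(self):
        self._evict_stale()
        if self._idle:
            conn, _ = self._idle.pop()
            return conn
        return self._connect()

    def release(self, conn):
        self._idle.append((conn, time.monotonic()))

    def _evict_stale(self):
        now = time.monotonic()
        fresh = []
        for conn, last_used in self._idle:
            if now - last_used > self._max_idle_s:
                conn.close()  # assumes the connection has close()
            else:
                fresh.append((conn, last_used))
        self._idle = fresh
```

The trade-off is exactly the one described above: a shorter timeout means fewer idle connections held open, at the cost of re-establishing them more often.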
Subject: Re: Session for connections?
To: as...@live.com
CC: user@spark.apache.org
Also, this is covered in the streaming programming guide in bits and pieces.
http://spark.apache.org/docs/latest/streaming-programming-guide.html#design-patterns-for-using-foreachrdd
On Thu, Dec
Hi,
I was wondering if there's any way of having long running session type
behaviour in spark. For example, let's say we're using Spark Streaming to
listen to a stream of events. Upon receiving an event, we process it, and if
certain conditions are met, we wish to send a message to rabbitmq.
You could create a lazily initialized singleton factory and connection
pool. Whenever an executor starts running the first task that needs to
push out data, it will create the connection pool as a singleton, and
subsequent tasks running on the executor will use that connection pool.
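A minimal sketch of that lazily initialized singleton pool in Python; the class name and the make_connection factory are hypothetical, and in a real job each task on an executor would call ConnectionPool.get from inside foreachPartition:

```python
import threading


class ConnectionPool:
    """Process-wide singleton pool, created lazily by the first task
    that needs it; later tasks on the same executor reuse it.

    `make_connection` is a hypothetical factory for your RabbitMQ (or
    other client) connections.
    """

    _instance = None
    _lock = threading.Lock()

    @classmethod
    def get(cls, make_connection):
        if cls._instance is None:
            with cls._lock:  # double-checked locking: tasks run concurrently
                if cls._instance is None:
                    cls._instance = cls(make_connection)
        return cls._instance

    def __init__(self, make_connection):
        self._make = make_connection
        self._free = []

    def acquire(self):
        return self._free.pop() if self._free else self._make()

    def release(self, conn):
        self._free.append(conn)
```

Because the singleton lives in the executor process, each executor ends up with exactly one pool, which is what makes the per-task overhead cheap.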
That makes sense. I'll try that.
Thanks :)
From: tathagata.das1...@gmail.com
Date: Thu, 11 Dec 2014 04:53:01 -0800
Subject: Re: Session for connections?
To: as...@live.com
CC: user@spark.apache.org