2. I notice that once I call ssc.start(), my stream starts processing and
continues indefinitely... even if I close the socket on the server end (I'm
using the unix command nc to mimic a server, as explained in the streaming
programming guide). Can I tell my stream to detect if it's lost a
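(For reference, here is a minimal sketch of the socket-stream setup from the streaming programming guide that this question refers to; the host, port, and batch interval are just the guide's illustrative values, paired with "nc -lk 9999" on the server side.)

  import org.apache.spark.SparkConf
  import org.apache.spark.streaming.{Seconds, StreamingContext}

  object SocketStreamSketch {
    def main(args: Array[String]): Unit = {
      val conf = new SparkConf().setAppName("SocketStreamSketch")
      val ssc = new StreamingContext(conf, Seconds(1))   // 1-second batches

      // Connect to the nc "server" started with: nc -lk 9999
      val lines = ssc.socketTextStream("localhost", 9999)
      lines.flatMap(_.split(" ")).map((_, 1)).reduceByKey(_ + _).print()

      ssc.start()
      // Runs until stopped explicitly; closing the nc socket on the server
      // side does not by itself terminate the streaming context.
      ssc.awaitTermination()
    }
  }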
On 28 Mar 2014, at 00:34, Scott Clasen scott.cla...@gmail.com wrote:
Actually, looking closer, it is stranger than I thought:
in the Spark UI, one executor has executed 4 tasks, and one has executed
1928.
Can anyone explain the workings of a KafkaInputStream w.r.t. Kafka partitions
and mapping to
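(For context, a minimal sketch of how such a Kafka input stream is typically created with the receiver-based KafkaUtils.createStream API; the ZooKeeper quorum, consumer group, and topic name below are placeholders, and an existing StreamingContext ssc is assumed. Note that the Int in the topic map is the number of consumer threads inside the single receiver, not the number of Kafka partitions or executors, which is part of what makes the task distribution confusing.)

  import org.apache.spark.streaming.StreamingContext
  import org.apache.spark.streaming.kafka.KafkaUtils

  // Placeholder connection details
  val zkQuorum = "zkhost:2181"
  val groupId  = "my-consumer-group"
  // topic -> number of consumer threads run inside the (single) receiver;
  // this is not the number of Kafka partitions and not the number of executors
  val topics   = Map("my-topic" -> 1)

  // Assumes an existing StreamingContext `ssc`
  val kafkaStream = KafkaUtils.createStream(ssc, zkQuorum, groupId, topics)
  kafkaStream.map(_._2).count().print()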
On 28 Mar 2014, at 01:44, Tathagata Das tathagata.das1...@gmail.com wrote:
The more I think about it, the problem is not about /tmp; it's more about the
workers not having enough memory. Blocks of received data could be falling
out of memory before they get processed.
BTW, what is the
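(A sketch of one possible mitigation, not necessarily what was used in this job: the receiver's storage level can be set explicitly so that received blocks spill to disk instead of being dropped when executor memory runs low. The names are the same placeholders as above.)

  import org.apache.spark.storage.StorageLevel
  import org.apache.spark.streaming.kafka.KafkaUtils

  // MEMORY_AND_DISK_SER keeps received blocks as serialized bytes and spills
  // them to disk under memory pressure instead of dropping them
  val kafkaStream = KafkaUtils.createStream(
    ssc, zkQuorum, groupId, topics,
    StorageLevel.MEMORY_AND_DISK_SER)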
On 28 Mar 2014, at 02:10, Scott Clasen scott.cla...@gmail.com wrote:
Thanks everyone for the discussion.
Just to note, I restarted the job yet again, and this time there are indeed
tasks being executed by both worker nodes. So the behavior does seem
inconsistent/broken atm.
Then I added