RE: Handling worker batch processing during driver shutdown

2015-03-13 Thread Tathagata Das
Can you access the batcher directly? Like, is there a handle to get access to the batchers on the executors by running a task on that executor? If so, after the streamingContext has been stopped (not the SparkContext), then you can use …
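
The "handle" in question would typically be an executor-local singleton. A minimal sketch of that idea, assuming a hypothetical `Batcher` object (the names and the buffering scheme are illustrative, not code from the thread): a Scala object is instantiated once per JVM, so every task scheduled on a given executor reaches the same instance.

```scala
import scala.collection.mutable.ArrayBuffer

// Hypothetical executor-local batcher. A Scala object lives once per JVM,
// so every task that runs on a given executor sees the same instance.
object Batcher {
  private lazy val instance = new Batcher
  def get: Batcher = instance
}

class Batcher {
  private val buffer = ArrayBuffer.empty[String]

  def add(record: String): Unit = synchronized { buffer += record }

  // Idempotent flush: writes any buffered records to the external sink
  // (elided here) and clears the buffer.
  def flush(): Unit = synchronized {
    // ... write buffer contents to the downstream system ...
    buffer.clear()
  }
}
```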

RE: Handling worker batch processing during driver shutdown

2015-03-13 Thread Jose Fernandez
and fails silently. I really appreciate your help, but it looks like I’m back to the drawing board on this one.

RE: Handling worker batch processing during driver shutdown

2015-03-12 Thread Jose Fernandez
Can you access the batcher directly? Like, is there a handle to get access to the batchers on the executors by running a task on that executor? If so, after the streamingContext has been stopped (not the SparkContext) …

Re: Handling worker batch processing during driver shutdown

2015-03-12 Thread Tathagata Das
using Spark 1.2 on CDH 5.3. I stop the application with yarn application -kill appID.

Re: Handling worker batch processing during driver shutdown

2015-03-12 Thread Tathagata Das
Can you access the batcher directly? Like, is there a handle to get access to the batchers on the executors by running a task on that executor? If so, after the streamingContext has been stopped (not the SparkContext), then you can use `sc.makeRDD()` to run a dummy task like this.
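
A minimal sketch of that dummy-task trick, assuming the hypothetical `Batcher` singleton sketched earlier and an assumed executor count (neither is code from the original thread). Using more partitions than executors makes it likely, though Spark gives no strict guarantee, that every executor runs at least one task:

```scala
// Stop the streaming side only; the SparkContext stays alive so we can
// still schedule plain jobs on the executors.
ssc.stop(stopSparkContext = false, stopGracefully = true)

val numExecutors = 4  // assumed; size this to the actual cluster
// Over-provision partitions so every executor is likely to get a task.
sc.makeRDD(1 to numExecutors * 4, numExecutors * 4).foreachPartition { _ =>
  Batcher.get.flush()  // flush whatever this executor has buffered
}

sc.stop()
```

Because several partitions can land on the same executor, `flush()` may run more than once per JVM and should therefore be idempotent.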