I really appreciate your help, but it looks like I’m back to the drawing
board on this one.
From: Tathagata Das [mailto:t...@databricks.com]
Sent: Thursday, March 12, 2015 7:53 PM
To: Jose Fernandez
Cc: user@spark.apache.org
Subject: Re: Handling worker batch processing during driver shutdown
… and fails silently.
Using Spark 1.2 on CDH 5.3. I stop the application with yarn
application -kill appID.
From: Tathagata Das [mailto:t...@databricks.com]
Sent: Thursday, March 12, 2015 1:29 PM
To: Jose Fernandez
Cc: user@spark.apache.org
Subject: Re: Handling worker batch processing during driver shutdown
Can you access the batcher directly? That is, is there a handle to get
access to the batchers on the executors by running a task on that executor?
If so, after the streamingContext has been stopped (not the SparkContext),
then you can use `sc.makeRDD()` to run a dummy task like this.
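
The actual snippet is elided from this excerpt, but a minimal sketch of such a
dummy flush task might look like the following. It assumes a hypothetical
executor-local Batcher singleton with a flush() method and a known list of
worker hostnames; neither name comes from the thread itself.

    // Sketch only: Batcher.get() and the host list are assumptions, not
    // APIs from the thread. sc.makeRDD has an overload that takes
    // (element, preferredLocations) pairs and creates one partition per
    // element, which lets us aim one task at each worker so every
    // executor's batcher gets a chance to flush.
    val workerHosts = Seq("worker1", "worker2")  // hypothetical hostnames
    sc.makeRDD(workerHosts.map(host => (host, Seq(host)))).foreach { _ =>
      Batcher.get().flush()  // flush this executor JVM's buffered batch
    }

Note that preferred locations are only preferences: Spark tries to schedule
each task on its listed host but does not guarantee placement, so adding more
elements per host to oversubscribe the executors is a common safety tweak.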