Hi Anand,

This is probably already handled by the RDD.pipe() operation. It spawns an
external process for each partition and lets you feed data to it through its
stdin and read its output back through stdout.
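
For example, here's a minimal sketch in Scala (the script path is just a
placeholder for whatever external program you want to run):

    val data = sc.parallelize(Seq("a", "b", "c"))
    // pipe() forks the command once per partition, writes that
    // partition's elements to the process's stdin (one per line),
    // and returns the process's stdout lines as a new RDD[String].
    val piped = data.pipe("/path/to/your_program")
    piped.collect().foreach(println)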

Matei

On May 29, 2014, at 9:39 AM, ansriniv <ansri...@gmail.com> wrote:

> I have a requirement where, for every Spark executor thread-pool thread, I
> need to launch an associated external process.
> 
> My job will consist of some processing in the Spark executor thread and some
> processing by its associated external process, with the two communicating via
> some IPC mechanism.
> 
> Is there a hook in Spark where I can add code to create and destroy these
> external processes in step with the creation and destruction of executor
> thread-pool threads?
> 
> Thanks
> Anand
