Michael Schmeißer commented on SPARK-650:

But I'll need to have an RDD to do this; I can't just do it during the 
SparkContext setup. Right now, we have multiple sources of RDDs, and every 
developer would still need to know that they have to run this code after 
creating an RDD, wouldn't they? Or is there some way to use a "pseudo-RDD" 
right after creation of the SparkContext to execute the init code on the 
executors?
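For context, a sketch of the workarounds I've seen discussed (assuming Scala; 
the names ExecutorSetup and numSlots are made up for illustration, and the 
"pseudo-RDD" variant is best-effort only, since Spark gives no guarantee that 
a partition lands on every executor):

```scala
// Workaround 1: a JVM-wide lazy singleton that runs the init code at most
// once per executor, the first time any task on that executor touches it.
object ExecutorSetup {
  // `lazy val` initialization is thread-safe in Scala, so concurrent tasks
  // on the same executor JVM trigger the body exactly once.
  lazy val initialized: Boolean = {
    // per-executor init goes here, e.g. configuring a reporting library
    true
  }
}

// Referencing the singleton at the top of every closure ensures the hook
// has run before the partition is processed:
rdd.mapPartitions { iter =>
  ExecutorSetup.initialized
  iter
}

// Workaround 2: a "pseudo-RDD" fired right after SparkContext creation.
// numSlots is an assumed count of total task slots; this only reaches
// executors that happen to receive a partition.
sc.parallelize(1 to numSlots, numSlots).foreachPartition { _ =>
  ExecutorSetup.initialized
}
```

The singleton variant is the more robust of the two, since it doesn't depend 
on task scheduling, but it still requires every developer to remember to 
reference it, which is exactly the problem a proper setup hook would solve.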

> Add a "setup hook" API for running initialization code on each executor
> -----------------------------------------------------------------------
>                 Key: SPARK-650
>                 URL: https://issues.apache.org/jira/browse/SPARK-650
>             Project: Spark
>          Issue Type: New Feature
>          Components: Spark Core
>            Reporter: Matei Zaharia
>            Priority: Minor
> Would be useful to configure things like reporting libraries
