[
https://issues.apache.org/jira/browse/SPARK-650?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15731628#comment-15731628
]
Michael Schmeißer commented on SPARK-650:
-----------------------------------------
No, it's not just about propagating information - some code actually needs to
be run. We have static utilities that need to be initialized, but they know
nothing about Spark; they are provided by external libraries. Thus, we need to
trigger the initialization ourselves on all executors. The only other way I see
is to wrap all access to those external utilities in something Spark-aware on
our side that initializes them on demand. But compared to that, I think our
current solution is better.
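The "Spark-aware wrapper" alternative mentioned above can be sketched as a process-local lazy initializer: each executor is its own process, so a module-level guard fires at most once per executor the first time a task touches the library. This is only an illustration of the pattern, not code from the ticket; the names `ExecutorLocalInit` and `init_reporting_library` are hypothetical stand-ins.

```python
import threading

class ExecutorLocalInit:
    """Runs a setup function at most once per worker process.

    Every task calls ensure() before using the external library; only the
    first call on a given executor actually performs the initialization.
    """
    def __init__(self, setup):
        self._setup = setup
        self._done = False
        self._lock = threading.Lock()
        self.calls = 0  # exposed for inspection/testing

    def ensure(self):
        # Double-checked locking: cheap fast path once initialized.
        if not self._done:
            with self._lock:
                if not self._done:
                    self._setup()
                    self.calls += 1
                    self._done = True

# Hypothetical stand-in for the external library's setup call,
# which in the scenario above we cannot modify:
def init_reporting_library():
    pass

reporting_init = ExecutorLocalInit(init_reporting_library)
```

In a real job, tasks would invoke `reporting_init.ensure()` at the start of each partition (e.g. inside `rdd.mapPartitions`), which is the per-call-site discipline the comment argues is more awkward than triggering initialization on all executors up front.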
> Add a "setup hook" API for running initialization code on each executor
> -----------------------------------------------------------------------
>
> Key: SPARK-650
> URL: https://issues.apache.org/jira/browse/SPARK-650
> Project: Spark
> Issue Type: New Feature
> Components: Spark Core
> Reporter: Matei Zaharia
> Priority: Minor
>
> Would be useful to configure things like reporting libraries
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)