Lars Francke commented on SPARK-650:

I also have to disagree with this being a duplicate or obsolete.

[~oarmand] and [~Skamandros] have already given reasons why this is not a duplicate.

About it being obsolete: I have seen multiple clients facing this problem, 
finding this issue and hoping it'd get fixed some day. I would hazard a guess 
and say that most _users_ of Spark have no JIRA account here and do not 
register or log in just to vote for this issue. Even so, this issue is (with 
six votes) in the top 150 out of almost 17k total issues in the Spark project.

As it happens, this is a non-trivial thing to implement in Spark (as far as I 
can tell from my limited knowledge of the inner workings), so it's pretty hard 
for a "drive-by" contributor to help here.
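For anyone landing here while the feature remains unimplemented: a common workaround (not an official Spark API) is to do lazy, once-per-process initialization in the code that tasks execute, e.g. a module-level guard in PySpark. The sketch below shows the pattern standalone; the reporter dict and function names are hypothetical stand-ins for whatever per-executor setup (such as configuring a reporting library) you actually need.

```python
# Hedged sketch: without a setup-hook API, per-executor initialization is
# often done lazily, guarded so it runs once per worker process. In PySpark
# you would call get_reporter() inside mapPartitions/foreachPartition.

_reporter = None  # per-process singleton (hypothetical reporting client)

def get_reporter():
    """Initialize the (hypothetical) reporting client once per process."""
    global _reporter
    if _reporter is None:
        # stand-in for real one-time setup, e.g. opening a metrics connection
        _reporter = {"initialized": True}
    return _reporter

def process_partition(rows):
    """Partition function: setup happens on first use in this process."""
    reporter = get_reporter()
    for row in rows:
        # real work would report metrics via `reporter` here
        yield row
```

This gives "at most once per executor process" semantics rather than a true startup hook: initialization happens on first task use, and nothing runs on executors that never receive a task.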

You had the discussion about community perception on the mailing list (re: 
Spark Improvement Proposals), and this issue happens to be one of those that 
I, at least, see popping up every once in a while in discussions with clients.

I would love to see this issue stay open as a feature request, and I have some 
hope that someone will implement it someday.

> Add a "setup hook" API for running initialization code on each executor
> -----------------------------------------------------------------------
>                 Key: SPARK-650
>                 URL: https://issues.apache.org/jira/browse/SPARK-650
>             Project: Spark
>          Issue Type: New Feature
>          Components: Spark Core
>            Reporter: Matei Zaharia
>            Priority: Minor
> Would be useful to configure things like reporting libraries

This message was sent by Atlassian JIRA
