[ 
https://issues.apache.org/jira/browse/SPARK-29396?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16949669#comment-16949669
 ] 

Imran Rashid commented on SPARK-29396:
--------------------------------------

The memory monitor plugin I wrote (the reason I pushed for the executor plugin 
API) was useful on the driver too -- it just used the hack I mentioned above: 
creating a SparkListener that ignores all events.
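
For reference, that no-op-listener hack looks roughly like the sketch below. 
The class name and the monitoring logic are made up for illustration; the 
point is that SparkListener's callbacks are all no-op defaults, so a subclass 
that overrides nothing still gets instantiated on the driver, which is the 
hook we actually want:

{code:scala}
import org.apache.spark.scheduler.SparkListener

// Hypothetical driver-side "plugin": a listener that overrides no
// event callbacks (so it ignores every event), but whose constructor
// runs arbitrary driver-side setup -- here, a memory-polling thread.
class DriverMemoryMonitor extends SparkListener {
  private val monitorThread = new Thread(() => {
    while (!Thread.currentThread().isInterrupted) {
      // poll driver memory / emit metrics here
      Thread.sleep(1000)
    }
  })
  monitorThread.setDaemon(true)
  monitorThread.start()
}

// Registered either programmatically in the driver program:
//   sc.addSparkListener(new DriverMemoryMonitor)
// ...or via conf, with no user code changes (fully-qualified name):
//   spark.extraListeners=DriverMemoryMonitor
{code}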

I think [~lucacanali] also has some use cases involving collecting custom 
metrics.

You can write custom code in the driver -- but I think the idea is that this 
allows you to turn plugins on without requiring users to touch their code.  
For example, a cluster admin might want a plugin for metric collection, and 
they could enable it for users running precompiled jobs, or for users who only 
interact via a SQL interface, etc.  It's similar to how you can either call 
{{sc.addSparkListener()}} inside your driver program or set the conf 
{{spark.extraListeners}}.
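
Concretely, the admin-side knob is just a conf entry, e.g. in 
spark-defaults.conf (the listener class name below is hypothetical):

{code}
# spark-defaults.conf -- enable a listener cluster-wide, no user code changes
spark.extraListeners    com.example.MetricsCollectorListener
{code}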

> Extend Spark plugin interface to driver
> ---------------------------------------
>
>                 Key: SPARK-29396
>                 URL: https://issues.apache.org/jira/browse/SPARK-29396
>             Project: Spark
>          Issue Type: New Feature
>          Components: Spark Core
>    Affects Versions: 3.0.0
>            Reporter: Marcelo Masiero Vanzin
>            Priority: Major
>
> Spark provides an extension API for people to implement executor plugins, 
> added in SPARK-24918 and later extended in SPARK-28091.
> That API does not offer any functionality for doing similar things on the 
> driver side, though. As a consequence of that, there is not a good way for 
> the executor plugins to get information or communicate in any way with the 
> Spark driver.
> I've been playing with such an improved API for developing some new 
> functionality. I'll file a few child bugs for the work to get the changes in.



--
This message was sent by Atlassian Jira
(v8.3.4#803005)
