[jira] [Commented] (SPARK-23385) Allow SparkUITab to be customized adding in SparkConf and loaded when creating SparkUI

2018-02-11 Thread Sean Owen (JIRA)

[ 
https://issues.apache.org/jira/browse/SPARK-23385?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16359960#comment-16359960
 ] 

Sean Owen commented on SPARK-23385:
---

I'm not sure about this. See PR.

> Allow SparkUITab to be customized adding in SparkConf and loaded when 
> creating SparkUI
> --
>
> Key: SPARK-23385
> URL: https://issues.apache.org/jira/browse/SPARK-23385
> Project: Spark
>  Issue Type: New Feature
>  Components: Spark Core
>Affects Versions: 2.2.1
>Reporter: Lantao Jin
>Priority: Major
>
> It would be nice if there was a mechanism for registering customized 
> SparkUITab implementations (alongside the built-in Jobs, Stages, Storage, 
> Environment, and Executors tabs) through SparkConf settings. This would be 
> more flexible than adding built-in tabs one by one and waiting for the 
> community to merge each change whenever we need to display special 
> information in the UI.
> I propose to introduce a new configuration option, spark.extraUITabs, that 
> allows customized WebUITab to be specified in SparkConf and registered when 
> SparkUI is created. Here is the proposed documentation for the new option:
> {quote}
> A comma-separated list of classes that implement SparkUITab; when 
> initializing SparkUI, instances of these classes will be created and 
> registered in the tabs array of SparkUI. If a class has a two-argument 
> constructor that accepts a SparkUI and an AppStatusStore, that constructor 
> will be called; otherwise, if it has a single-argument constructor that 
> accepts a SparkUI, that constructor will be called; otherwise, a 
> zero-argument constructor will be called. If no valid constructor can be 
> found, SparkUI creation will fail with an exception.
> {quote}
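The constructor-resolution order described in the proposed documentation can be sketched with plain JVM reflection. This is an illustrative sketch only: the SparkUI, AppStatusStore, and CustomTab classes below are hypothetical stand-ins, not Spark's real classes, and the actual PR may resolve constructors differently.

```java
import java.lang.reflect.Constructor;

// Hypothetical stand-ins for Spark's real SparkUI and AppStatusStore.
class SparkUI {}
class AppStatusStore {}

// An example custom tab exposing the two-argument constructor the
// proposal prefers; the others exist to show the fallback order.
class CustomTab {
    final String chosen;
    CustomTab(SparkUI ui, AppStatusStore store) { chosen = "two-arg"; }
    CustomTab(SparkUI ui) { chosen = "one-arg"; }
    CustomTab() { chosen = "zero-arg"; }
}

public class TabLoader {
    // Try (SparkUI, AppStatusStore), then (SparkUI), then a zero-argument
    // constructor; fail with an exception when none is found -- the
    // resolution order the proposed documentation describes.
    static Object instantiate(String className, SparkUI ui, AppStatusStore store)
            throws Exception {
        Class<?> cls = Class.forName(className);
        try {
            Constructor<?> c =
                cls.getDeclaredConstructor(SparkUI.class, AppStatusStore.class);
            return c.newInstance(ui, store);
        } catch (NoSuchMethodException ignored) { /* fall through */ }
        try {
            Constructor<?> c = cls.getDeclaredConstructor(SparkUI.class);
            return c.newInstance(ui);
        } catch (NoSuchMethodException ignored) { /* fall through */ }
        try {
            return cls.getDeclaredConstructor().newInstance();
        } catch (NoSuchMethodException e) {
            throw new IllegalArgumentException(
                "No valid constructor found for " + className, e);
        }
    }

    public static void main(String[] args) throws Exception {
        CustomTab tab = (CustomTab)
            instantiate("CustomTab", new SparkUI(), new AppStatusStore());
        System.out.println(tab.chosen);  // prints "two-arg"
    }
}
```

Under the proposal, a user would then list each tab class in the new option, e.g. spark.extraUITabs=com.example.MyTab (the class name here is a made-up example).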



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org



[jira] [Commented] (SPARK-23385) Allow SparkUITab to be customized adding in SparkConf and loaded when creating SparkUI

2018-02-10 Thread Apache Spark (JIRA)

[ 
https://issues.apache.org/jira/browse/SPARK-23385?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16359825#comment-16359825
 ] 

Apache Spark commented on SPARK-23385:
--

User 'LantaoJin' has created a pull request for this issue:
https://github.com/apache/spark/pull/20574



