Jacek Lewandowski created SPARK-11882:
-----------------------------------------

             Summary: Allow for running Spark applications against a custom 
coarse grained scheduler
                 Key: SPARK-11882
                 URL: https://issues.apache.org/jira/browse/SPARK-11882
             Project: Spark
          Issue Type: Wish
          Components: Spark Core, Spark Submit
            Reporter: Jacek Lewandowski
            Priority: Minor


SparkContext decides which scheduler to use according to the master URI. How 
about allowing applications to run against a custom scheduler? Such a custom 
scheduler would simply extend {{CoarseGrainedSchedulerBackend}}. 

The custom scheduler would be created by a provided factory. Factories would be 
defined in the configuration as 
{{spark.scheduler.factory.<name>=<factory-class>}}, where {{name}} is the 
scheduler name. Once {{SparkContext}} determines that the master address does 
not belong to standalone, YARN, Mesos, local or any other predefined scheduler, 
it would resolve the scheme from the provided master URI and look up the 
scheduler factory whose name equals the resolved scheme. 
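A minimal sketch of that scheme-based lookup, assuming a helper object and a set of predefined scheme names that are illustrative only, not existing Spark code:

```scala
import java.net.URI

// Illustrative helper: map a master URL to the proposed configuration key.
// The object name and the set of predefined schemes are assumptions made
// for this sketch.
object SchedulerFactoryResolver {
  private val predefined = Set("spark", "yarn", "mesos", "local")

  // Returns Some("spark.scheduler.factory.<scheme>") for a custom scheme,
  // or None when the master is handled by a predefined scheduler.
  def factoryKey(masterUrl: String): Option[String] = {
    val scheme = Option(new URI(masterUrl).getScheme)
    scheme.filterNot(predefined.contains)
          .map(s => s"spark.scheduler.factory.$s")
  }
}
```

{{SparkContext}} would then read the resolved key from the configuration and fail fast if no factory class is registered for that scheme.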

For example, given
{{spark.scheduler.factory.custom=org.a.b.c.CustomSchedulerFactory}},
the master address would be {{custom://192.168.1.1}}.
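Under such a configuration, the factory contract itself could be as small as one method. The trait and class below use hypothetical names to sketch the idea, with a {{String}} placeholder standing in for the real {{CoarseGrainedSchedulerBackend}} so the example stays self-contained:

```scala
// Hypothetical factory contract; SchedulerFactory and CustomSchedulerFactory
// are illustrative names, not existing Spark classes.
trait SchedulerFactory {
  // A real implementation would return a CoarseGrainedSchedulerBackend;
  // a String placeholder keeps this sketch dependency-free.
  def createSchedulerBackend(masterUrl: String): String
}

// What org.a.b.c.CustomSchedulerFactory from the example might look like.
class CustomSchedulerFactory extends SchedulerFactory {
  override def createSchedulerBackend(masterUrl: String): String =
    s"custom backend for $masterUrl"
}
```

{{SparkContext}} would presumably instantiate the configured class reflectively (e.g. via {{Class.forName}}) once the URI scheme matched a registered factory name.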




--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
