Hi Ray,

In standalone mode, the SparkDeploySchedulerBackend holds an AppClient. That is
the driver-side component that already talks to the Master to register the
application.
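To make the flow concrete, here is a minimal sketch (plain Scala stand-ins, not actual Spark code) of the pattern: the scheduler module never messages the Master directly; it delegates to the AppClient, which already knows the Master's endpoint. The message and method names below are illustrative assumptions.

```scala
// Illustrative message the client would send to the deploy module (Master.scala).
case class RequestExecutors(appId: String, requestedTotal: Int)

// Stand-in for the Master's message handler.
class Master {
  def receive(msg: Any): Boolean = msg match {
    case RequestExecutors(appId, total) =>
      println(s"Master: executor target for $appId is now $total")
      true
    case _ => false
  }
}

// Stand-in for AppClient: the driver-side handle that already talks to the Master.
class AppClient(appId: String, master: Master) {
  def requestTotalExecutors(requestedTotal: Int): Boolean =
    master.receive(RequestExecutors(appId, requestedTotal))
}

// Stand-in for SparkDeploySchedulerBackend: the scheduler module delegates
// executor requests to the AppClient rather than contacting the Master itself.
class SparkDeploySchedulerBackend(client: AppClient) {
  def doRequestTotalExecutors(requestedTotal: Int): Boolean =
    client.requestTotalExecutors(requestedTotal)
}

object Demo extends App {
  val master  = new Master
  val client  = new AppClient("app-0001", master)
  val backend = new SparkDeploySchedulerBackend(client)
  assert(backend.doRequestTotalExecutors(4))
}
```

In real Spark the hop between the two modules is an RPC message rather than a direct method call, but the ownership is the same: scheduler backend → AppClient → Master.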

As for dynamic allocation in standalone mode, I literally *just* created a
patch on GitHub: https://github.com/apache/spark/pull/7532. Feel free to
have a look if you're interested. :)

-Andrew

2015-07-18 18:47 GMT-07:00 Dogtail Ray <spark.ru...@gmail.com>:

> Hi all,
>
> I am planning to dynamically increase or decrease the number of executors
> allocated to an application during runtime, and it is similar to dynamic
> resource allocation, which is only feasible in Spark on Yarn mode. Any
> suggestions on how to implement this feature in Standalone mode?
>
> My current problem is: I want to send an ADD_EXECUTOR command from the
> scheduler module (in CoarseGrainedSchedulerBackend.scala) to the deploy module
> (in Master.scala), but I don't know how to communicate between the two
> modules.... Many thanks for any suggestions!
>
>