Try sc.getExecutorStorageStatus.length (it's a parameterless method in
Scala, so no parentheses).

SparkContext's getExecutorMemoryStatus or getExecutorStorageStatus will
give you back one entry per executor (including the driver, so subtract
one if you only want the workers). The StorageStatus objects are what
drive much of the Spark Web UI.

https://spark.apache.org/docs/1.0.1/api/scala/index.html#org.apache.spark.SparkContext
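
For example, a rough sketch against the 1.x Scala API. The multiplier of
4 below is just an illustration of deriving parallelism from the worker
count, not a tuning recommendation:

    val statuses = sc.getExecutorStorageStatus   // Array[StorageStatus], one per BlockManager
    val numWorkers = statuses.length - 1         // the driver has an entry too

    // getExecutorMemoryStatus maps "host:port" -> (max memory, remaining memory)
    val memStatus = sc.getExecutorMemoryStatus

    // Derive a parallelism level from the worker count:
    val partitions = numWorkers * 4
    val data = sc.parallelize(1 to 1000000, partitions)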


On Thu, Jul 24, 2014 at 11:16 AM, Nicolas Mai <nicolas....@gmail.com> wrote:

> Hi,
>
> Is there a way to get the number of slaves/workers during runtime?
>
> I searched online but didn't find anything :/ The application I'm working
> on will run on different clusters corresponding to different deployment
> stages (beta -> prod). It would be great to get the number of slaves
> currently in use, in order to set the level of parallelism and RDD
> partitions based on that number.
>
> Thanks!
> Nicolas
>
