Here is the doc for defaultParallelism:
/** Default level of parallelism to use when not given by user
 *  (e.g. parallelize and makeRDD). */
def defaultParallelism: Int = {
  assertNotStopped()
  taskScheduler.defaultParallelism
}
What if the user changes the parallelism?
Cheers
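
For what it's worth, the default can be overridden up front with the
spark.default.parallelism configuration property, which the scheduler's
defaultParallelism then reflects. A minimal sketch (assuming a local
master; the value 8 is just an example):

import org.apache.spark.{SparkConf, SparkContext}

// spark.default.parallelism overrides the scheduler's computed default,
// so parallelize/makeRDD use it whenever numSlices is not given.
val conf = new SparkConf()
  .setMaster("local[4]")
  .setAppName("parallelism-demo")
  .set("spark.default.parallelism", "8")
val sc = new SparkContext(conf)

println(sc.defaultParallelism)                      // 8, from the config
println(sc.parallelize(1 to 100).getNumPartitions)  // 8 as well
sc.stop()

Note this is fixed at SparkContext creation; it does not change
dynamically while the application runs.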
On Fri, Mar 25, 2016 at 5:33 AM, manasdebashiskar <[email protected]>
wrote:
> There is a sc.defaultParallelism field that I use to dynamically
> maintain elasticity in my application. Depending upon your scenario this
> might be enough.
>
> --
> View this message in context:
> http://apache-spark-user-list.1001560.n3.nabble.com/Best-way-to-determine-of-workers-tp26586p26594.html
> Sent from the Apache Spark User List mailing list archive at Nabble.com.
>