Hi Gurus,

I am running a Spark job, and one of its stages creates only 9 tasks. So even
though I have 25 executors, only 9 of them are utilized.

The other executors go to dead status. How can I increase the number of
tasks so that all my executors are utilized? Any help/guidance is appreciated
:)
<http://apache-spark-user-list.1001560.n3.nabble.com/file/t8535/s1.jpg> 
<http://apache-spark-user-list.1001560.n3.nabble.com/file/t8535/s2.jpg> 
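
In case it helps frame the question: the task count of a stage matches the
partition count of the data it processes, so the usual lever is raising the
number of partitions. A minimal sketch of the relevant settings (the value 50
and the job file name are illustrative assumptions, not recommendations):

```shell
# Illustrative spark-submit flags; tune the values for your data volume.
spark-submit \
  --conf spark.default.parallelism=50 \
  --conf spark.sql.shuffle.partitions=50 \
  my_job.py
```

The programmatic equivalent would be calling repartition(n) on the DataFrame
or RDD feeding the under-parallelized stage.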



--
Sent from: http://apache-spark-user-list.1001560.n3.nabble.com/

---------------------------------------------------------------------
To unsubscribe e-mail: user-unsubscr...@spark.apache.org
