This may be due to your YARN constraints, so it is worth looking at your YARN
configuration parameters.
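Without dynamic allocation the executor count and cores per executor are fixed
up front, so you can ask YARN for a few fat executors instead of many thin
ones. A rough sketch, assuming you are submitting to YARN (the memory size and
application jar are placeholders to adjust for your cluster):

# Two executors with 10 cores each = 20 cores on at most two NodeManagers.
# (your-application.jar and the 8g figure are placeholders)
spark-submit \
  --master yarn \
  --num-executors 2 \
  --executor-cores 10 \
  --executor-memory 8g \
  your-application.jar

Whether YARN will actually grant 10 vcores per container may be capped by
yarn.scheduler.maximum-allocation-vcores (and yarn.nodemanager.resource.cpu-vcores
on each node) in yarn-site.xml; if those limits are lower than what you request,
the containers end up smaller and the cores get spread across more nodes.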


On 7/25/2019 20:23, Amit Sharma <resolve...@gmail.com> wrote:
I have a cluster with 26 nodes, each with 16 cores. I am running a Spark job
with 20 cores, but I do not understand why my application gets only 1-2 cores
on a couple of machines rather than running on just two nodes, e.g. node1 = 16
cores and node2 = 4 cores. Instead the cores are allocated like node1 = 2,
node2 = 1, ..., node14 = 1. Is there any conf property I need to change? I know
that with dynamic allocation we can use the setting below, but without dynamic
allocation is there anything?
--conf "spark.dynamicAllocation.maxExecutors=2"

Thanks
Amit
