Github user liyinan926 commented on the issue:
    `spark.kubernetes.executor.cores` has nothing to do with dynamic resource
allocation. It simply lets users specify a value for the CPU resource request
that conforms to Kubernetes conventions, and it is read/used only when
determining the executor pod's CPU request.
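
    To illustrate, here is a hedged sketch of how the property might be set. The
values are hypothetical; the point is that Kubernetes accepts fractional or
millicpu CPU quantities (e.g. `0.5` or `500m`), which this property passes
through to the pod's CPU request:

    ```properties
    # spark.executor.cores controls task parallelism (must be an integer)
    spark.executor.cores=1
    # spark.kubernetes.executor.cores sets the pod's CPU request and may use
    # Kubernetes-style quantities; "500m" (half a CPU) is an illustrative value
    spark.kubernetes.executor.cores=500m
    ```

    With a configuration like this, each executor still runs one task at a time,
but the pod only requests half a CPU from the Kubernetes scheduler.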

