dongjoon-hyun opened a new pull request, #53134:
URL: https://github.com/apache/spark/pull/53134

   ### What changes were proposed in this pull request?
   
   This PR aims to increase the default value of `spark.kubernetes.allocation.batch.size` from 10 to 20 in Apache Spark 4.2.0.
   
   ### Why are the changes needed?
   
   Since Apache Spark 4.0.0, the default executor allocation batch size has been `10`. This PR increases it further to speed up executor pod allocation on Kubernetes.
   - https://github.com/apache/spark/pull/49681
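
   As a sketch of how this plays out for users: anyone who prefers the previous, more conservative allocation rate can pin the old value explicitly rather than rely on the default, for example in `spark-defaults.conf`:

   ```properties
   # Restore the pre-4.2.0 default: create at most 10 executor pods
   # per allocation round on Kubernetes.
   spark.kubernetes.allocation.batch.size=10
   ```

   The same setting can be passed per job with `--conf spark.kubernetes.allocation.batch.size=10` on `spark-submit`.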
   
   ### Does this PR introduce _any_ user-facing change?
   
   Yes. Users will see faster Spark job resource allocation on Kubernetes. The migration guide is updated accordingly.
   
   ### How was this patch tested?
   
   Pass the CIs.
   
   ### Was this patch authored or co-authored using generative AI tooling?
   
   No.


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]
