kirkuz commented on issue #2323:
URL: https://github.com/apache/hudi/issues/2323#issuecomment-743110390


   @n3nash I believe those are the default values Spark derives from the hardware 
configuration when spark.dynamicAllocation.enabled = true. I was running this on 
an AWS EMR cluster of 1 master and 6 core nodes (r5d.4xlarge, 16 vCores and 
128 GB RAM each). This is why:
   
   spark.executor.instances = 6
   spark.executor.cores = 16
   
   Can you advise me on how I should parametrize it?
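
   For context, one common way to override the dynamic-allocation defaults is to pass explicit executor settings at submit time. The sketch below is illustrative only; the specific values are assumptions, not a recommendation for this cluster:

   ```shell
   # Disable dynamic allocation and size executors explicitly.
   # Example values only; tune instances/cores/memory for the actual workload.
   spark-submit \
     --conf spark.dynamicAllocation.enabled=false \
     --conf spark.executor.instances=6 \
     --conf spark.executor.cores=5 \
     --conf spark.executor.memory=32g \
     --class org.example.MyHudiJob \
     my-job.jar
   ```

   Fewer cores per executor (e.g. 4-5 instead of 16) is a frequently cited rule of thumb to reduce GC pressure and HDFS/S3 I/O contention, at the cost of more executor JVMs per node.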


----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
[email protected]

