Hi,

Does current or future (2.0) Spark dynamic allocation have the capability to request containers with varying resource requirements based on various factors? A few factors I can think of: based on the stage and the data it is processing, it could ask for either more CPUs or more memory, i.e. a new executor could have a different number of CPU cores or a different amount of memory available to all of its tasks. That way Spark could handle data skew with heavier executors, by assigning more memory or CPUs to the newly requested executors.
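For context, here is a minimal sketch (assuming YARN with the external shuffle service, and Spark 2.0's SparkSession API) of how executor sizing is expressed today: dynamic allocation only scales the *number* of executors up and down, while memory and cores are fixed once per application, so every container requested has the same shape.

```scala
import org.apache.spark.SparkConf
import org.apache.spark.sql.SparkSession

// Illustrative config only: dynamic allocation varies the executor *count*,
// but spark.executor.memory / spark.executor.cores apply uniformly to every
// executor the application ever requests.
val conf = new SparkConf()
  .setAppName("uniform-executor-sizing-example")          // hypothetical app name
  .set("spark.dynamicAllocation.enabled", "true")
  .set("spark.shuffle.service.enabled", "true")            // required for dynamic allocation on YARN
  .set("spark.executor.memory", "4g")                      // same for all executors
  .set("spark.executor.cores", "2")                        // same for all executors
  .set("spark.dynamicAllocation.minExecutors", "1")
  .set("spark.dynamicAllocation.maxExecutors", "20")

val spark = SparkSession.builder().config(conf).getOrCreate()
```

In other words, there is currently no per-stage or per-executor knob in this config to ask the cluster manager for a differently sized container; the question is whether such heterogeneous requests are (or will be) supported.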
Thanks,
Nirav