Hi all,

I’m deploying a Spark cluster on EC2 that is shared by multiple users, and I would like to shrink the cluster when most users are idle, to save cost. I’m running one worker on each instance.
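For concreteness, here is roughly the scale-down step I have in mind (a minimal sketch in Python with boto3; pick_idle_instances() is a hypothetical helper, and the region and instance IDs are placeholders):

import boto3

# Region is a placeholder; credentials come from the usual boto3 config.
ec2 = boto3.client("ec2", region_name="us-east-1")

def pick_idle_instances():
    # Hypothetical helper: in practice I would check which workers have
    # no cores in use and map them back to their EC2 instance IDs.
    return ["i-0123456789abcdef0"]  # placeholder ID

idle = pick_idle_instances()
if idle:
    # This is the step I'm asking about: terminating instances directly,
    # without first draining the Spark workers running on them.
    ec2.terminate_instances(InstanceIds=idle)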
My question: if I directly shut down some instances while tasks are still running on them, what happens on the Spark side? Will Spark recover those tasks with something like speculative execution, or will the job unfortunately fail?

Best,

--
Nan Zhu
School of Computer Science, McGill University
