Hi all,

I’m running a Hadoop cluster on AWS EC2, and I would like to dynamically
resize the cluster to reduce cost. Is there any solution to achieve this?

E.g. I would like to cut the cluster size in half: is it safe to just shut
down the instances? (If some tasks are running on them, can I rely on
speculative execution to re-run them on other nodes?)
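For context, the route I’ve been looking at is Hadoop’s graceful decommissioning rather than a hard shutdown. A rough sketch of the steps (the hostname and the exclude-file path below are just placeholders, and the config keys assume a stock Hadoop 1.x setup, which my customized build may differ from):

```shell
# Sketch of graceful decommissioning (assumes stock Hadoop 1.x config;
# the exclude file is whatever dfs.hosts.exclude / mapred.hosts.exclude
# point to in hdfs-site.xml / mapred-site.xml).

# 1. List the nodes to remove in the exclude file:
echo "ip-10-0-0-12.ec2.internal" >> /etc/hadoop/conf/excludes

# 2. Tell the NameNode to re-replicate blocks off those nodes:
hadoop dfsadmin -refreshNodes

# 3. Tell the JobTracker to stop scheduling tasks on them:
hadoop mradmin -refreshNodes

# 4. Wait until the NameNode web UI shows the nodes as
#    "Decommissioned", then terminate the EC2 instances.
```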

I cannot use EMR, since I’m running a customized version of Hadoop.

Best,  

--  
Nan Zhu
School of Computer Science,
McGill University
