A big thanks to Qi Qiao <https://github.com/mirocody> for making it easy for 
users to set up a cluster for Dynamic Training using AWS CloudFormation.

From: "Kumar, Vikas" <[email protected]>
Date: Thursday, November 29, 2018 at 10:26 AM
To: "[email protected]" <[email protected]>
Subject: [Launch Announcement] Dynamic training with Apache MXNet

Hello MXNet community,

MXNet users can now use Dynamic Training (DT) for deep learning models with 
Apache MXNet. DT reduces training cost and training time by adding elasticity 
to the distributed training cluster, which also improves instance pool 
utilization: with DT, unused instances can be added to the cluster to speed up 
training, and later removed so that other applications can use them.
For details, refer to the DT 
blog<https://aws.amazon.com/blogs/machine-learning/introducing-dynamic-training-for-deep-learning-with-amazon-ec2/>.
Developers should be able to integrate Dynamic Training into their existing 
distributed training code by adding a few extra lines of 
code<https://github.com/awslabs/dynamic-training-with-apache-mxnet-on-aws#writing-a-distributed-training-script>.

Thanks to all the contributors: Vikas Kumar <https://github.com/Vikas89>, 
Haibin Lin <https://github.com/eric-haibin-lin>, Andrea Olgiati 
<https://github.com/andreaolgiati>, Mu Li <https://github.com/mli>, 
Hagay Lupesko <https://github.com/lupesko>, Aaron Markham 
<https://github.com/aaronmarkham>, Sergey Sokolov 
<https://github.com/Ishitori>, and Qi Qiao <https://github.com/mirocody>.

This is an effort towards making the training of neural networks cheaper and 
faster. We welcome your contributions to the repo at 
https://github.com/awslabs/dynamic-training-with-apache-mxnet-on-aws, and we 
would love to hear your feedback and ideas in this direction.

Thanks,
Vikas