[GitHub] piiswrong commented on issue #8373: distribute training in fp16

2017-12-12 Thread GitBox
piiswrong commented on issue #8373: distribute training in fp16 URL: https://github.com/apache/incubator-mxnet/pull/8373#issuecomment-351214158 Closing as outdated. @rahul003, please guide @solin319 in merging this into the gradient compression framework.
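
For context, gradient compression in MXNet is enabled through the KVStore API. A minimal sketch follows; the '2bit' type and its 'threshold' parameter are the ones the framework shipped with, while an 'fp16' type like the one proposed in this PR is assumed here for illustration and was not part of the released interface.

    import mxnet as mx

    # Distributed KVStore; gradients are compressed before being pushed
    # to the parameter servers.
    kv = mx.kvstore.create('dist_sync')

    # Shipped interface: 2-bit quantization with a configurable threshold.
    kv.set_gradient_compression({'type': '2bit', 'threshold': 0.5})

    # Hypothetical: an 'fp16' type, as discussed in this thread, would plug
    # into the same interface, e.g. {'type': 'fp16'}.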

[GitHub] piiswrong commented on issue #8373: distribute training in fp16

2017-10-27 Thread GitBox
piiswrong commented on issue #8373: distribute training in fp16 URL: https://github.com/apache/incubator-mxnet/pull/8373#issuecomment-339889212 My mistake, I meant a general compression interface, not a general n-bit compression interface. For now we can have 2bit and float16 after merging.
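
To illustrate what a general compression interface (as opposed to an n-bit-only one) could look like, here is a hedged Python sketch; the class and method names are invented for illustration and are not MXNet code. An fp16 compressor simply casts gradients down for transport and back up on receipt, halving bandwidth at the cost of precision.

    import numpy as np

    class GradientCompressor:
        # Hypothetical general interface: any lossy or lossless scheme
        # implements these two methods.
        def compress(self, grad):
            raise NotImplementedError
        def decompress(self, data):
            raise NotImplementedError

    class Fp16Compressor(GradientCompressor):
        def compress(self, grad):
            # Lossy cast: float16 keeps roughly 3 significant decimal digits.
            return grad.astype(np.float16)
        def decompress(self, data):
            return data.astype(np.float32)

    grad = np.random.randn(8).astype(np.float32)
    comp = Fp16Compressor()
    restored = comp.decompress(comp.compress(grad))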

[GitHub] piiswrong commented on issue #8373: distribute training in fp16

2017-10-21 Thread GitBox
piiswrong commented on issue #8373: distribute training in fp16 URL: https://github.com/apache/incubator-mxnet/pull/8373#issuecomment-338371778 @eric-haibin-lin @rahul003 I think this should be merged with the 2bit PR to make a more general n-bit gradient compression feature.
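
For concreteness, here is a rough numpy sketch of the idea behind 2-bit gradient quantization with error feedback, the scheme this PR was asked to merge into. The actual MXNet kernels pack values into bits and operate on NDArrays, so this only shows the arithmetic; the function and variable names are invented for illustration.

    import numpy as np

    def quantize_2bit(grad, residual, threshold=0.5):
        acc = grad + residual                # fold in accumulated error
        q = np.zeros_like(acc)
        q[acc >= threshold] = threshold      # large positive -> +threshold
        q[acc <= -threshold] = -threshold    # large negative -> -threshold
        residual[:] = acc - q                # carry quantization error forward
        return q

    residual = np.zeros(4, dtype=np.float32)
    g = np.array([0.9, -0.2, -0.7, 0.1], dtype=np.float32)
    print(quantize_2bit(g, residual))        # [ 0.5  0.  -0.5  0. ]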