[GitHub] 7oud commented on issue #9420: add use_global_stats in nn.BatchNorm

2018-02-26 Thread GitBox
7oud commented on issue #9420: add use_global_stats in nn.BatchNorm URL: https://github.com/apache/incubator-mxnet/pull/9420#issuecomment-368716803 @thbupt The batch size in training is 8, and in inference it is usually 1. This is
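The batch-size mismatch above is the crux of the issue. A minimal plain-Python sketch (illustrative only, not the MXNet implementation; all numbers are made up) of why normalizing with per-batch statistics degenerates at inference batch size 1:

```python
# Sketch: train-mode BatchNorm normalizes with the current batch's
# own mean and variance. With a batch of one sample, the mean equals
# the sample and the variance is 0, so the output collapses to ~0
# no matter what the input was.

def batch_norm(batch, eps=1e-5):
    """Normalize a 1-D batch using its own mean/variance (train-mode BN)."""
    n = len(batch)
    mean = sum(batch) / n
    var = sum((x - mean) ** 2 for x in batch) / n
    return [(x - mean) / (var + eps) ** 0.5 for x in batch]

# Training batch of 8: the statistics are meaningful.
train_out = batch_norm([1.0, 2.0, 3.0, 4.0, 5.0, 6.0, 7.0, 8.0])

# Inference batch of 1: the "normalized" output is 0 for any input,
# which is why inference must fall back to accumulated global stats.
infer_out = batch_norm([42.0])
print(infer_out)  # [0.0]
```

This is the situation `use_global_stats` addresses: normalize with the stored moving statistics instead of the (degenerate) single-sample batch statistics.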

[GitHub] 7oud commented on issue #9420: add use_global_stats in nn.BatchNorm

2018-02-26 Thread GitBox
7oud commented on issue #9420: add use_global_stats in nn.BatchNorm URL: https://github.com/apache/incubator-mxnet/pull/9420#issuecomment-368714486 @tornadomeet It seems so, but I cannot draw a conclusion, because the dataset is too small to give a reliable result

[GitHub] 7oud commented on issue #9420: add use_global_stats in nn.BatchNorm

2018-02-26 Thread GitBox
7oud commented on issue #9420: add use_global_stats in nn.BatchNorm URL: https://github.com/apache/incubator-mxnet/pull/9420#issuecomment-368713958 @thbupt Actually I did as you said, but the same data batch has different outputs when using forward(is_train=False) and
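The difference described here is expected BatchNorm behaviour rather than a bug: train mode normalizes with the current batch's statistics, while eval mode normalizes with the accumulated moving statistics. A small plain-Python sketch (illustrative, with made-up running statistics; not MXNet code):

```python
# Sketch: the same batch produces different BatchNorm outputs in
# train mode (batch statistics) vs eval mode (running statistics).

def bn_train(batch, eps=1e-5):
    """Train mode: normalize with this batch's own mean/variance."""
    mean = sum(batch) / len(batch)
    var = sum((x - mean) ** 2 for x in batch) / len(batch)
    return [(x - mean) / (var + eps) ** 0.5 for x in batch]

def bn_eval(batch, running_mean, running_var, eps=1e-5):
    """Eval mode: normalize with the stored moving statistics."""
    return [(x - running_mean) / (running_var + eps) ** 0.5 for x in batch]

batch = [0.0, 2.0, 4.0]
out_train = bn_train(batch)
# Hypothetical accumulated statistics that differ from this batch's:
out_eval = bn_eval(batch, running_mean=1.0, running_var=9.0)

print(out_train)
print(out_eval)  # differs unless the running stats match the batch stats
```

The two outputs only coincide when the moving statistics happen to equal the batch statistics, which is rarely the case, especially early in training or on small datasets.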

[GitHub] 7oud commented on issue #9420: add use_global_stats in nn.BatchNorm

2018-02-26 Thread GitBox
7oud commented on issue #9420: add use_global_stats in nn.BatchNorm URL: https://github.com/apache/incubator-mxnet/pull/9420#issuecomment-368709872 @thbupt I found that in some small-dataset training tasks, such as segmentation, the inference result is worse than the training result when using BatchNorm

[GitHub] 7oud commented on issue #9420: add use_global_stats in nn.BatchNorm

2018-02-25 Thread GitBox
7oud commented on issue #9420: add use_global_stats in nn.BatchNorm URL: https://github.com/apache/incubator-mxnet/pull/9420#issuecomment-368320260 @szha @tornadomeet If training with use_global_stats=True, it seemed that all the moving_mean = 0 and moving_var = 1 in the trained model, is is
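The observation above follows from the semantics of `use_global_stats`: when it is set, the layer only reads the moving statistics and never updates them, so they stay at their initialization (mean 0, variance 1) for the whole run. A plain-Python sketch of that update rule (an illustration of the behaviour, not the MXNet source):

```python
# Sketch: with use_global_stats=True the moving statistics are only
# read, never written, so they keep their initial values (0 and 1)
# no matter how many batches are processed.

def bn_step(batch, state, momentum=0.9, use_global_stats=False, eps=1e-5):
    mean = sum(batch) / len(batch)
    var = sum((x - mean) ** 2 for x in batch) / len(batch)
    if use_global_stats:
        # Normalize with the stored statistics; do NOT update them.
        m, v = state["moving_mean"], state["moving_var"]
    else:
        # Normalize with batch statistics and update the moving averages.
        m, v = mean, var
        state["moving_mean"] = momentum * state["moving_mean"] + (1 - momentum) * mean
        state["moving_var"] = momentum * state["moving_var"] + (1 - momentum) * var
    return [(x - m) / (v + eps) ** 0.5 for x in batch]

state = {"moving_mean": 0.0, "moving_var": 1.0}  # typical initialization
for _ in range(100):
    bn_step([5.0, 6.0, 7.0], state, use_global_stats=True)

print(state)  # {'moving_mean': 0.0, 'moving_var': 1.0}
```

So a model trained from scratch with `use_global_stats=True` ends up with untrained moving statistics; the flag is normally meant for fine-tuning or inference, where the statistics were already accumulated by a previous training run.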