thbupt commented on issue #9420: add use_global_stats in nn.BatchNorm
URL: https://github.com/apache/incubator-mxnet/pull/9420#issuecomment-368721069
@7oud I think 8 is too small a batch size for BN; you could try a larger one like 16 or 32.
thbupt commented on issue #9420: add use_global_stats in nn.BatchNorm
URL: https://github.com/apache/incubator-mxnet/pull/9420#issuecomment-368715232
@7oud What batch size are you using? BN seems to prefer large batch sizes.
thbupt commented on issue #9420: add use_global_stats in nn.BatchNorm
URL: https://github.com/apache/incubator-mxnet/pull/9420#issuecomment-368714220
@tornadomeet Is there a simple way to set use_global_stats=True everywhere when
finetuning? I know one way is to set use_global_stats=True for
thbupt commented on issue #9420: add use_global_stats in nn.BatchNorm
URL: https://github.com/apache/incubator-mxnet/pull/9420#issuecomment-368417931
@7oud I have the same question. I think use_global_stats=True should be used
when you finetune a pretrained model such as ResNet or VGG.