@prathik-naidu I think that model may actually have been trained without 
normalization. 

I get worse results if I apply normalization to the data. The top prediction becomes: 
`probability=0.060943, class=n01930112 nematode, nematode worm, roundworm`, 
which is clearly worse than the original one: `probability=0.249607, 
class=n02119022 red fox, Vulpes vulpes`.

If I switch to the Gluon model (`resnet18 = 
vision.resnet18_v1(pretrained=True)`), then it is clear that it was trained 
with normalization in place. With normalization I get: `probability=8.849780, 
class=n02124075 Egyptian cat`; without it I get: `probability=60.358135, 
class=n04270147 spatula`.

So, I am not sure how this model was trained. It could be that it actually 
didn't use normalization. What results do you get if you apply normalization?
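For reference, here is a minimal sketch of the per-channel normalization I am applying before inference. It uses the standard ImageNet mean/std that Gluon's pretrained vision models expect (the same values typically passed to `mxnet.gluon.data.vision.transforms.Normalize`); the helper name is my own, and the input is assumed to be an HWC float image already scaled to [0, 1]:

```python
import numpy as np

# Standard ImageNet channel statistics (RGB order) used by
# Gluon's pretrained vision models.
IMAGENET_MEAN = np.array([0.485, 0.456, 0.406], dtype=np.float32)
IMAGENET_STD = np.array([0.229, 0.224, 0.225], dtype=np.float32)

def normalize(img):
    """Normalize an HWC float image in [0, 1] per channel.

    img: np.ndarray of shape (H, W, 3), values in [0, 1].
    Returns the zero-mean / unit-variance image expected by
    models trained with ImageNet normalization.
    """
    return (img - IMAGENET_MEAN) / IMAGENET_STD

# Example: a uniform mid-gray image.
gray = np.full((224, 224, 3), 0.5, dtype=np.float32)
out = normalize(gray)
```

Skipping this step (i.e. feeding the [0, 1] image directly) is what produces the `spatula` result above with the Gluon model.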

[ Full content available at: 
https://github.com/apache/incubator-mxnet/issues/12063 ]