@winstywang So, if I have a pretrained model and I want to freeze the first 90% of the layers, while training the remaining layers with gradient updates plus weight decay, how can I do that in MXNet?
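For concreteness, something along these lines is what I have in mind. This is only a rough sketch using the Gluon API; the model (`resnet18_v1`), the 90% cutoff by parameter count, and the SGD hyperparameters are placeholders, not a confirmed recipe:

```python
import mxnet as mx
from mxnet import gluon
from mxnet.gluon.model_zoo import vision

# Placeholder pretrained model.
net = vision.resnet18_v1(pretrained=True)

params = net.collect_params()
names = list(params.keys())
cutoff = int(0.9 * len(names))          # first ~90% of parameter arrays

# Freeze the early parameters by disabling gradient computation for them.
for name in names[:cutoff]:
    params[name].grad_req = 'null'

# Only the remaining parameters go to the trainer, which applies
# SGD with weight decay ('wd') to them.
trainable = [params[name] for name in names[cutoff:]]
trainer = gluon.Trainer(
    trainable, 'sgd',
    {'learning_rate': 0.01, 'momentum': 0.9, 'wd': 1e-4},
)
```

Is this the intended way, or is there a recommended mechanism (e.g. something like `fixed_param_names` in the Module API, or per-parameter `lr_mult`/`wd_mult`) for this use case?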

[ Full content available at: https://github.com/apache/incubator-mxnet/issues/3073 ]