tz-hmc opened a new issue #8591: How do I make a siamese network with 
pretrained models (esp. keeping the weights the same?)
URL: https://github.com/apache/incubator-mxnet/issues/8591
 
 
   ## Description
   How do I ensure the weights are kept the same? Can I unpack the internal 
layers somehow and set the weights of each to the same variable? My apologies, 
I'm new to MXNet. Would really appreciate the help, thanks!
   
   ```python
   sym1, arg_params1, aux_params1 = mx.model.load_checkpoint('resnet-152', 0)
   sym2, arg_params2, aux_params2 = mx.model.load_checkpoint('resnet-152', 0)
   layer1 = sym1.get_internals()
   layer2 = sym2.get_internals()
   for i in range(len(layer1)): # will something like this work?
       arg_params1[i] = arg_params2[i]
   ```
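   One note on the loop above: `arg_params` returned by `load_checkpoint` is a dict keyed by parameter *name*, not by integer position, so indexing it with `i` won't work. A minimal MXNet-free sketch of the weight-tying idea, using plain dicts of NumPy arrays as stand-ins for the two loaded `arg_params` dicts (the parameter names here are made up for illustration):

   ```python
   import numpy as np

   # Stand-ins for the two arg_params dicts returned by load_checkpoint:
   # keys are parameter names, values are weight arrays.
   arg_params1 = {'conv0_weight': np.ones((3, 3)), 'fc1_weight': np.ones((4, 2))}
   arg_params2 = {k: v.copy() for k, v in arg_params1.items()}

   # Tie the weights: make both dicts reference the *same* arrays,
   # so an in-place update through one branch is seen by the other.
   for name in arg_params1:
       arg_params2[name] = arg_params1[name]

   # In-place update via one dict is now visible through the other.
   arg_params1['fc1_weight'] += 1.0
   assert arg_params2['fc1_weight'] is arg_params1['fc1_weight']
   ```

   The same principle should carry over to MXNet: if both branches are bound with references to the same parameter arrays (or, in the Symbol API, built from the same named variables), their weights stay identical by construction rather than needing to be copied after every update.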
   
   Relevant answers, but not specific enough to my particular problem:
   - https://github.com/apache/incubator-mxnet/issues/772 - siamese networks
   - https://github.com/apache/incubator-mxnet/issues/6791 - extract layers as variables
   - https://github.com/apache/incubator-mxnet/issues/557 - set weights to be the same
