thomelane commented on issue #11133: Manipulating nn.Dense(...) layer parameters
URL: 
https://github.com/apache/incubator-mxnet/issues/11133#issuecomment-394900159
 
 
   Hi @lu4,
   
   You can create a clone of your network and make the adjustments during the 
copy. If you're using a Sequential Block as a container for your network, you 
can create another Sequential Block and add all of the layers from one network 
to the other, which saves you from redefining the network. You would make the 
changes to the necessary layers before adding them to the new Sequential Block.
   
   As I understand the problem, you'll need to change the weights and biases 
of the layer you want to expand, and also the weights of the next dense layer 
(since its weight shape depends on the number of units in the preceding layer, 
which has changed). After constructing the new weights and biases (i.e. padding 
with 0s), you can then use 
[`set_data`](https://mxnet.incubator.apache.org/api/python/gluon/gluon.html?highlight=set_data#mxnet.gluon.Parameter.set_data)
 on the parameters of interest before adding the layers to the new Sequential Block.
   
   Unfortunately I don't think you can mutate the original network like this, 
because you're changing the shape of the parameters: you'll hit shape assertion 
errors. And you can't just swap out a single layer in the original Sequential 
Block, because Sequential Blocks don't support item assignment.

----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
[email protected]


With regards,
Apache Git Services
