danache opened a new issue #7785: How can I build a conv block using other 
conv's weights.
URL: https://github.com/apache/incubator-mxnet/issues/7785
 
 
   Hi. I am trying to reimplement a Torch7 model in MXNet. Since I am new to MXNet, I ran into several problems for which I could not find solutions in the documentation.
   The Torch code is as follows:
   ```
   local function AttentionIter(numIn, inp , lrnSize, itersize)
       local pad = math.floor(lrnSize/2)
       local U = nnlib.SpatialConvolution(numIn,1, 3, 3, 1,1,1,1)(inp)
       local spConv = nnlib.SpatialConvolution(1,1,lrnSize,lrnSize,1,1,pad,pad)
       -- need to share the parameters and the gradParameters as well
       local spConv_clone = spConv:clone('weight','bias','gradWeight','gradBias')
   
       local Q={}
       local C = {}
       for i=1,itersize do
           local conv 
           local Q_tmp
   
           if i==1 then
               conv = spConv(U)
           else
               conv = spConv_clone(Q[i-1])
           end
           table.insert(C,conv)
           Q_tmp = nn.Sigmoid()(nn.CAddTable(true)({C[i], U}))
           table.insert(Q,Q_tmp)
       end
   
       local pfeat = nn.CMulTable(){inp, nn.Replicate(numIn,   2){Q[itersize]}}
       return pfeat 
   end
   ```
   
   I followed the answer in [#7758](https://github.com/apache/incubator-mxnet/issues/7758), so I initialized a class like this:
   ```
   class AttentionIter(gluon.HybridBlock):
       # note: in Gluon the input `inp` is passed to __call__/hybrid_forward,
       # not to the constructor as in Torch
       def __init__(self, numIn, lrnSize, itersize, **kwargs):
           super(AttentionIter, self).__init__(**kwargs)
           self.itersize = itersize
           self.numIn = numIn
           pad = lrnSize // 2  # integer division; np.floor returns a float

           with self.name_scope():
               self.U = gluon.nn.Conv2D(1, kernel_size=3, strides=1, padding=1)
               self.spConv = gluon.nn.Conv2D(1, kernel_size=lrnSize,
                                             strides=1, padding=pad)

       def hybrid_forward(self, F, x):
           # F is a function space that depends on the type of x
           # If x's type is NDArray, then F will be mxnet.nd
           # If x's type is Symbol, then F will be mxnet.sym
           U = self.U(x)
           Q = []
           C = []
           for i in range(self.itersize):
               if i == 0:
                   conv = self.spConv(U)
               else:
                   # how do I make this a clone that shares weights with
                   # self.spConv, like spConv_clone does in Torch?
                   conv = self.spConv(Q[i - 1])
               C.append(conv)
               # Q_tmp = nn.Sigmoid()(nn.CAddTable(true)({C[i], U})) -- how?
           return x
   ```
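   (For reference, a quick way to see the `F` dispatch described in the comments above is a tiny standalone block; the name `Identity` is just illustrative:)
   ```python
   import mxnet as mx
   from mxnet import gluon

   class Identity(gluon.HybridBlock):
       def hybrid_forward(self, F, x):
           # F is mxnet.ndarray in imperative mode, mxnet.symbol once hybridized
           print('F is', F.__name__)
           return F.identity(x)

   net = Identity()
   x = mx.nd.ones((2, 3))
   net(x)           # imperative call: F is mxnet.ndarray
   net.hybridize()
   net(x)           # first call after hybridize traces through mxnet.symbol
   ```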
   But I have no idea how to reuse the parameters of one conv layer in another, the way the Torch code does with:
   ```
    local spConv_clone = spConv:clone('weight','bias','gradWeight','gradBias')
   
   ```
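   (For reference: in Gluon, parameters can be shared either by simply calling the same block in several places, or by building a second block that borrows the first one's `ParameterDict` via the `params=` keyword. A minimal sketch, assuming `mxnet.gluon`; the shapes are made up:)
   ```python
   import mxnet as mx
   from mxnet.gluon import nn

   # Two ways to get the effect of Torch's clone-with-shared-storage:
   # 1) call the same block twice -- one set of weights used in both places;
   # 2) build a second block that borrows the first one's ParameterDict.
   spConv = nn.Conv2D(1, kernel_size=3, strides=1, padding=1)
   spConv_clone = nn.Conv2D(1, kernel_size=3, strides=1, padding=1,
                            params=spConv.params)

   spConv.initialize()
   x = mx.nd.random.uniform(shape=(1, 1, 8, 8))
   # both layers read (and accumulate gradients into) the same weight and bias,
   # so their outputs are identical
   print((spConv(x) - spConv_clone(x)).abs().sum().asscalar())
   ```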
   Besides, I am confused about how to compute `Q_tmp`: the `CAddTable` part has to be implemented in the `hybrid_forward` function, but `Q_tmp` needs the intermediate results from the loop.
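   (For reference: `nn.CAddTable` followed by `nn.Sigmoid` reduces to an elementwise add plus `sigmoid` inside `hybrid_forward`, and `nn.CMulTable` with `nn.Replicate` reduces to `broadcast_mul`, since broadcasting replicates the 1-channel map over the `numIn` channels. A minimal `mxnet.nd` sketch with made-up shapes:)
   ```python
   import mxnet as mx

   # Torch: nn.Sigmoid()(nn.CAddTable(true)({C, U}))
   C = mx.nd.random.uniform(shape=(1, 1, 4, 4))
   U = mx.nd.random.uniform(shape=(1, 1, 4, 4))
   Q = mx.nd.sigmoid(C + U)          # elementwise add, then sigmoid

   # Torch: nn.CMulTable(){inp, nn.Replicate(numIn, 2){Q}}
   inp = mx.nd.random.uniform(shape=(1, 3, 4, 4))   # numIn = 3 channels
   pfeat = mx.nd.broadcast_mul(inp, Q)  # broadcasting replaces nn.Replicate
   print(pfeat.shape)
   ```
   Inside `hybrid_forward` the same lines work with `F.sigmoid` and `F.broadcast_mul`, so they hybridize.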
   Thank you !
 
----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
[email protected]


With regards,
Apache Git Services
