My objective is to update the activation values coming from a
DenseLayer. The activations should be replaced with values taken from a pre-initialized array.

P.S.: I'm using Lasagne, but my problem really comes down to Theano's
gradient operation.

Example:
A DenseLayer with 4 units:
At training stage n, assume that each unit passes these activations to the
next layer:

a1 = 0.32
a2 = 0.33
a3 = 1.2
a4 = -1.1

My predefined array = [0.3, 1.0, 1.3, -1.0, -2.0, ...]

A layer that takes its input from this dense layer has to replace each of
these activations with the nearest value from the predefined array (sketched
in NumPy after the list), so this new layer's activations will be

a1 = 0.3
a2 = 0.3
a3 = 1.3
a4 = -1.0
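
To make the intended mapping concrete, here is the same nearest-value
snapping in plain NumPy (just an illustration with the example numbers
above, not the layer itself):

import numpy as np

bins = np.array([0.3, 1.0, 1.3, -1.0, -2.0])
acts = np.array([0.32, 0.33, 1.2, -1.1])

# index of the closest bin for every activation
idx = np.argmin(np.abs(bins[:, None] - acts[None, :]), axis=0)
print(bins[idx])  # -> [ 0.3  0.3  1.3 -1. ]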

Here is my code for this operation:

import theano.tensor as T
from lasagne.layers import Layer

class DummyLayer(Layer):

    def __init__(self, incoming, array, **kwargs):
        super(DummyLayer, self).__init__(incoming, **kwargs)
        # the predefined values, shape (1, n_bins)
        self.bins = self.add_param(array, shape=array.shape, name='array',
                                   trainable=False)

    def get_output_for(self, input, **kwargs):
        theta = T.reshape(input, (-1, 6))  # batch_size x 6 parameters from the dense layer
        # distance of every activation to every bin: (n_bins, batch_size * 6)
        dist = self.bins.transpose() - theta.flatten()
        arg_min = T.argmin(abs(dist), axis=0)  # index of the nearest bin per activation
        arg_min_c = T.cast(arg_min, 'int32')
        # look up the nearest bin value and restore the original shape
        return T.reshape(self.bins[0, arg_min_c], (-1, 6))
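
For reference, here is how I wire the layer up (a minimal sketch; the
10-dimensional input and the bin values are made up for the example):

import numpy as np
import theano
from lasagne.layers import InputLayer, DenseLayer, get_output

l_in = InputLayer((None, 10))
l_dense = DenseLayer(l_in, num_units=6)
bins = np.array([[0.3, 1.0, 1.3, -1.0, -2.0]],
                dtype=theano.config.floatX)
l_snap = DummyLayer(l_dense, bins)
output = get_output(l_snap)  # taking T.grad of a loss on this fails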

However, Theano raises a DisconnectedType error.
Is this because the "self.bins[0, arg_min_c]" indexing has no gradient
with respect to the input? I tried the T.choose() operation as well, but it
also has no grad defined.

The possible solution I'm thinking of is to define this array-selection
operation as a custom Theano op, but I have no idea how to define this
selection operation's gradient.
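
The only concrete idea I have so far is to avoid a custom op entirely and
fake the gradient with a straight-through estimator via
theano.gradient.disconnected_grad, i.e. return the snapped values in the
forward pass but let the gradient flow as if the layer were the identity
(untested sketch, replacing get_output_for above):

from theano.gradient import disconnected_grad

def get_output_for(self, input, **kwargs):
    theta = T.reshape(input, (-1, 6))
    dist = self.bins.transpose() - theta.flatten()
    arg_min_c = T.cast(T.argmin(abs(dist), axis=0), 'int32')
    snapped = T.reshape(self.bins[0, arg_min_c], (-1, 6))
    # forward pass: snapped values; backward pass: identity w.r.t. theta
    return theta + disconnected_grad(snapped - theta)

But I'm not sure this is a sound thing to do.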

Has anyone faced this problem before?
