Thanks Pascal,

I tried your approach:

import numpy as np
import theano


class TemporalDifference(object):

    def __init__(self):
        self.old = None

    def __call__(self, data):
        if self.old is None:
            # lazily create the state with one size-1 dim per data dim
            self.old = theano.shared(np.zeros((1, ) * data.ndim))
        diff = data - self.old
        add_update(self.old, data)  # the update helper from your suggestion
        return diff


However, that leads to a different problem: when I compile and run on test
data (shaped (3, 4, 5, 6)), I get:

ValueError: Input dimension mis-match. (input[0].shape[0] = 3, input[1].shape[0] = 1)

because the state's value shape of (1, 1, 1, 1) doesn't match the shape of the data.
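
For concreteness, the way I compile and run it is roughly equivalent to this
stand-alone snippet (assuming that add_update boils down to handing an updates
list to theano.function, which is my understanding of your helper):

import numpy as np
import theano
import theano.tensor as T

data = T.dtensor4('data')
old = theano.shared(np.zeros((1, ) * 4))   # state, value shape (1, 1, 1, 1)
diff = data - old                          # elementwise subtraction
# assumption: this updates list is what add_update registers under the hood
f = theano.function([data], diff, updates=[(old, data)])

f(np.random.randn(3, 4, 5, 6))             # -> ValueError: Input dimension mis-match

Since the shared variable is not marked broadcastable, the elementwise
subtraction refuses to broadcast its size-1 dimensions against 3, 4, 5 and 6.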

If I try to get around this by making the state variable broadcastable:

class TemporalDifference(object):

    def __init__(self):
        self.old = None

    def __call__(self, data):
        if self.old is None:
            # same as before, but mark every dimension as broadcastable
            self.old = theano.shared(
                np.zeros((1, ) * data.ndim),
                broadcastable=(True, ) * data.ndim)
        diff = data - self.old
        add_update(self.old, data)
        return diff



Then I get:

TypeError: ('An update must have the same type as the original shared variable
(shared_var=<TensorType(float64, (True, True, True, True))>,
shared_var.type=TensorType(float64, (True, True, True, True)),
update_val=<TensorType(float64, 4D)>,
update_val.type=TensorType(float64, 4D)).',
'If the difference is related to the broadcast pattern, you can call the
tensor.unbroadcast(var, axis_to_unbroadcast[, ...]) function to remove
broadcastable dimensions.')
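
For reference, the call that the error message points at would look roughly
like this (a sketch from my reading of the docs; new_update is just a name I
made up):

import theano.tensor as T

# unbroadcast only removes broadcastable flags from the listed axes
new_update = T.unbroadcast(data, 0, 1, 2, 3)

but my update value (data) is already non-broadcastable, so as far as I can
tell this doesn't change anything; the real mismatch is that the shared
variable's all-broadcastable pattern can never be matched by an update whose
actual shape is (3, 4, 5, 6).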


So now I'm wondering whether there's any way to keep this kind of stateful
update when the data's shape isn't known up front.
