Using my approach, you would actually need two theano functions:
- one to set the value of self.old to the shape it needs to have to be
compatible with the data, which you would call once;
- one that updates the value according to the training procedure, which
would be called in the main loop.
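
For concreteness, here is a minimal sketch of those two functions (a
sketch only, assuming 4-d float64 data, i.e. floatX=float64; the names
td_init and td_step are just illustrative):

import numpy as np
import theano
import theano.tensor as tensor

data = tensor.tensor4('data')
old = theano.shared(np.zeros((1, 1, 1, 1)), name='old')

# One-time initializer: resize `old` to zeros of the data's shape.
td_init = theano.function([data], [],
                          updates=[(old, tensor.zeros_like(data))])

# Main-loop step: compute the difference and remember the new value.
td_step = theano.function([data], data - old,
                          updates=[(old, data)])

You would call td_init(x) once on the first batch, then td_step(x)
inside the loop.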

Fred's suggested approach was to have a lazy conditional inside the graph
that would replace the value of self.old with zeros as big as data on the
first iteration. Something like:

from theano.ifelse import ifelse

# ifelse needs a scalar condition, so reduce the element-wise shape
# comparison with any().
old = ifelse(tensor.any(tensor.neq(self.old.shape, (1, 1, 1, 1))),
             self.old,
             tensor.alloc(0., *data.shape))
diff = data - old
add_update(self.old, data)
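
Putting that together, a minimal sketch of the whole class along those
lines (using an explicit updates list in place of your add_update
helper, and again assuming 4-d float64 data):

import numpy as np
import theano
import theano.tensor as tensor
from theano.ifelse import ifelse

class TemporalDifference(object):

    def __init__(self, ndim=4):
        # Placeholder value; the real shape is only known at call time.
        # By default the shared variable is non-broadcastable, so the
        # update with `data` below type-checks.
        self.old = theano.shared(np.zeros((1,) * ndim), name='old')
        self.updates = []

    def __call__(self, data):
        # On the first pass self.old still has its placeholder shape,
        # so substitute zeros shaped like `data`.
        old = ifelse(tensor.any(tensor.neq(self.old.shape,
                                           (1,) * self.old.ndim)),
                     self.old,
                     tensor.alloc(0., *data.shape))
        diff = data - old
        self.updates.append((self.old, data))
        return diff

data = tensor.tensor4('data')
td = TemporalDifference()
fn = theano.function([data], td(data), updates=td.updates)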

On Thu, Nov 10, 2016, Peter O'Connor wrote:
> Thanks Pascal,
> 
> I tried your approach:
> 
> class TemporalDifference(object):
> 
>     def __init__(self):
>         self.old = None
> 
>     def __call__(self, data):
>         if self.old is None:
>             self.old = theano.shared(np.zeros((1, )*data.ndim))
>         diff = data - self.old
>         add_update(self.old, data)
>         return diff
> 
> 
> but that leads to a different problem: when I compile and run on test
> data (shaped (3, 4, 5, 6)), I get:
> 
> ValueError: Input dimension mis-match. (input[0].shape[0] = 3,
> input[1].shape[0] = 1)
> 
> because (1, 1, 1, 1) doesn't match the shape of the data.
> 
> If I try to work around this by making the state variable broadcastable:
> 
> class TemporalDifference(object):
> 
>     def __init__(self):
>         self.old = None
> 
>     def __call__(self, data):
>         if self.old is None:
>             self.old = theano.shared(np.zeros((1, )*data.ndim),
>                                      broadcastable=(True, )*data.ndim)
>         diff = data - self.old
>         add_update(self.old, data)
>         return diff
> 
> Then I get:
> 
> TypeError: ('An update must have the same type as the original shared
> variable (shared_var=<TensorType(float64, (True, True, True, True))>,
> shared_var.type=TensorType(float64, (True, True, True, True)),
> update_val=<TensorType(float64, 4D)>,
> update_val.type=TensorType(float64, 4D)).', 'If the difference is
> related to the broadcast pattern, you can call the
> tensor.unbroadcast(var, axis_to_unbroadcast[, ...]) function to
> remove broadcastable dimensions.')
> 
> 
> So now I'm wondering if there's any way to do this.
> 
> 
> On Wednesday, November 9, 2016 at 5:53:58 PM UTC+1, Pascal Lamblin wrote:
> >
> > Hi, 
> >
> > It is not possible to initialize a shared variable without a shape, 
> > since it needs a value. 
> >
> > However, the shape of a shared variable is not fixed (only the number of 
> > dimensions and dtype are). 
> >
> > So you could create a shared variable with shape (1, 1, ...) for 
> > instance, and then either call set_value() or a function with 
> > updates=... to properly initialize it when you actually know its shape. 
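> >
> > For instance (a sketch; the (3, 4, 5, 6) shape is just an example):
> >
> > old = theano.shared(np.zeros((1, 1, 1, 1)))
> > # Later, once the actual shape is known, resize in place:
> > old.set_value(np.zeros((3, 4, 5, 6)))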
> >
> > On Wed, Nov 09, 2016, Peter O'Connor wrote: 
> > > Hi all, I'm implementing a "temporal difference", which is just this: 
> > > 
> > > class TemporalDifference(object): 
> > > 
> > >     def __init__(self, shape): 
> > >         self.old = theano.shared(np.zeros(shape)) 
> > > 
> > >     def __call__(self, data): 
> > >         diff = data - self.old 
> > >         add_update(self.old, data) 
> > >         return diff 
> > > 
> > > 
> > > It would be really nice not to have to pass in the shape in
> > > advance, as it can be a bit difficult to figure out sometimes.  Is
> > > there some way to do this without having to know the shape in
> > > advance?
> > > 
> > > Thanks. 
> > > 
> >
> >
> > -- 
> > Pascal 
> >
> 


-- 
Pascal
