Re: [theano-users] Theano variable sent to for slicing instead of constant while using theano.grad

2018-03-06 Thread Siddhartha Saxena
The dtype of grad_steps and s_ is float64, while self.truncate_gradient is a Python float. Sorry I didn't answer it properly previously. Thanks
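As a side note, the distinction being discussed can be checked directly. The snippet below is a plain NumPy illustration (Theano dtype names follow NumPy's), using hypothetical stand-in values, showing the difference between a float64 dtype and a plain Python float:

```python
import numpy as np

# Hypothetical stand-ins for the values in the report:
grad_steps = np.float64(20.0)   # a value carrying a NumPy/Theano dtype
truncate_gradient = 20.0        # a plain Python float, as in the bug

print(grad_steps.dtype)         # float64
print(type(truncate_gradient))  # <class 'float'>

# Slicing ultimately requires an integer dtype (e.g. int64):
print(np.int64(20).dtype)       # int64
```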

Re: [theano-users] Theano variable sent to for slicing instead of constant while using theano.grad

2018-03-06 Thread Pascal Lamblin
OK, but what is the `dtype` (data type) of those variables? On 2018-03-06 01:48 PM, Siddhartha Saxena wrote: grad_steps itself has the value "Elemwise{minimum,no_inplace}.0". So here a tensor, s_ (of type Subtensor{::int64}.0), is being sliced by a variable. Again, how it is

Re: [theano-users] Theano variable sent to for slicing instead of constant while using theano.grad

2018-03-06 Thread Siddhartha Saxena
Thanks a lot Pascal, I have solved the problem now. The issue was that self.truncate_gradient was a float instead of an int.
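The root cause here generalizes beyond Theano: Python slice bounds must be integers (or objects defining `__index__`), so a float that leaks into a slice raises a TypeError. A minimal sketch, using a plain list and a hypothetical value of 20.0 in place of truncate_gradient:

```python
seq = list(range(100))

truncate = 20.0  # hypothetical float value, as in the reported bug
try:
    tail = seq[-truncate:]          # float slice bound: raises TypeError
except TypeError as e:
    print("float slice fails:", e)

truncate = int(truncate)            # the fix: cast to int before slicing
tail = seq[-truncate:]
print(len(tail))                    # 20
```

In the Theano case the symptom differed (a symbolic variable was built for the slice instead of a constant), but the fix is the same: pass an int, not a float.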

Re: [theano-users] Theano variable sent to for slicing instead of constant while using theano.grad

2018-03-06 Thread Siddhartha Saxena
grad_steps itself has the value "Elemwise{minimum,no_inplace}.0". So here a tensor, s_ (of type Subtensor{::int64}.0), is being sliced by a variable. Again, how it is reaching there is what I am unable to understand. Thanks

Re: [theano-users] Theano variable sent to for slicing instead of constant while using theano.grad

2018-03-06 Thread Pascal Lamblin
OK, thanks. self.truncate_gradient should not be a Python float; it should be an integer. This is probably why the dtype of grad_steps is float64 instead of int64 (or another integer dtype). Do you have any idea why self.truncate_gradient would not be "-1" (the default value)? Did you set