That's because Theano shared variables are symbolic: rebinding them in a
Python loop does not update their stored value the way ordinary Python
assignment would. You need to use the `updates` argument when creating the
Theano function. I suggest reading more in the intro here:
intro here: http://deeplearning.net/software/theano/tutorial/examples.html
On Thursday, 11 May 2017 12:40:13 UTC+1, Giuseppe Angora wrote:
>
> Hi,
> I'm trying to solve the following problem: a Theano function has as its
> output the value that a class method returns after running a loop, within
> which a parameter is updated:
>
> import theano
> import theano.tensor as T
> import numpy as np
> import copy
> theano.config.exception_verbosity = 'high'
> class Test(object):
>     def __init__(self):
>         self.rate=0.01
>         W_val=40.00
>         self.W=theano.shared(value=W_val, borrow=True)
>     def start(self, x, y):
>         for i in range(5):
>             z=T.mean(x*self.W/y)
>             gz=T.grad(z, self.W)
>             self.W-=self.rate*gz
>         return z
>
> x_set=np.array([1.,2.,1.,2.,1.,2.,1.,2.,1.,2.])
> y_set=np.array([1,2,1,2,1,2,1,2,1,2])
> x_set = theano.shared(x_set, borrow=True)
> y_set = theano.shared(y_set, borrow=True)
> y_set=T.cast(y_set, 'int32')
> batch_size=2
>
> x = T.dvector('x')
> y = T.ivector('y')
> index = T.lscalar()
>
> test = Test()
> cost=test.start(x,y)
>
> train = theano.function(
>     inputs=[index],
>     outputs=cost,
>     givens={
>         x: x_set[index * batch_size: (index + 1) * batch_size],
>         y: y_set[index * batch_size: (index + 1) * batch_size]
>     })
> for i in range(5):
>     result=train(i)
>     print(result)
>
> this is the output of the print statement:
>
> 39.96000000089407
> 39.96000000089407
> 39.96000000089407
> 39.96000000089407
> 39.96000000089407
>
> Now the gradient of mean(x*W/y) is equal to 1 (because x and y always have
> the same values). So the first time I should get 39.95, then 39.90, and so
> on... Why do I always get the same result?
>
> Thanks
>
>