Thanks Pascal, your suggestion works great!
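For anyone who lands on this thread later, here is a minimal plain-NumPy sketch (my own, not from Pascal's reply) of what the `updates=` mechanism effectively accomplishes: the stored parameter value really changes between calls, which is exactly what rebinding `self.W` to a new symbolic expression does not do. It takes one gradient step per batch (the quoted code takes five symbolic steps per call), and the variable names mirror the script in the quoted question:

```python
import numpy as np

# Same data and hyperparameters as in the quoted script below.
x_set = np.array([1., 2., 1., 2., 1., 2., 1., 2., 1., 2.])
y_set = np.array([1, 2, 1, 2, 1, 2, 1, 2, 1, 2])
rate = 0.01
W = 40.00
batch_size = 2

for index in range(5):
    x = x_set[index * batch_size:(index + 1) * batch_size]
    y = y_set[index * batch_size:(index + 1) * batch_size]
    # z = mean(x * W / y), so dz/dW = mean(x / y) = 1.0 here,
    # because x and y are elementwise equal.
    gz = np.mean(x / y)
    # This mutation is what updates=[(W, new_W)] gives you in Theano:
    # the stored value actually changes, so each step sees the new W.
    W -= rate * gz
    print(W)
```

Each iteration prints a strictly smaller W (40.00 minus 0.01 per step), which is the behaviour the original question expected.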
On Thursday, May 11, 2017 at 17:40:53 UTC+2, Pascal Lamblin wrote:
>
> When you do:
>
> self.W -= self.rate * gz
>
> You are not updating the value of the shared variable that was in
> self.W. You are creating a new symbolic variable representing
> "self.W - self.rate * gz", and re-assigning the member of Test to
> point to that new variable.
>
> Instead, you want to pass "updates" to theano.function.
>
> For instance:
> def start(self, x, y):
>     new_W = self.W
>     for i in range(5):
>         z = ...
>         gz = ...
>         new_W -= self.rate * gz
>     return z, (self.W, new_W)
>
> ...
> cost, update = test.start(x, y)
> ...
> train = theano.function(
>     ...,
>     updates=[update])
>
>
> On Thu, May 11, 2017, Giuseppe Angora wrote:
> > Hi,
> > I'm trying to solve the following problem: a Theano function's output
> > is the value that a class method returns after running a loop within
> > which a parameter is updated:
> >
> > import theano
> > import theano.tensor as T
> > import numpy as np
> > import copy
> > theano.config.exception_verbosity = 'high'
> > class Test(object):
> >     def __init__(self):
> >         self.rate = 0.01
> >         W_val = 40.00
> >         self.W = theano.shared(value=W_val, borrow=True)
> >     def start(self, x, y):
> >         for i in range(5):
> >             z = T.mean(x * self.W / y)
> >             gz = T.grad(z, self.W)
> >             self.W -= self.rate * gz
> >         return z
> >
> > x_set=np.array([1.,2.,1.,2.,1.,2.,1.,2.,1.,2.])
> > y_set=np.array([1,2,1,2,1,2,1,2,1,2])
> > x_set = theano.shared(x_set, borrow=True)
> > y_set = theano.shared(y_set, borrow=True)
> > y_set=T.cast(y_set, 'int32')
> > batch_size=2
> >
> > x = T.dvector('x')
> > y = T.ivector('y')
> > index = T.lscalar()
> >
> > test = Test()
> > cost=test.start(x,y)
> >
> > train = theano.function(
> >     inputs=[index],
> >     outputs=cost,
> >     givens={
> >         x: x_set[index * batch_size: (index + 1) * batch_size],
> >         y: y_set[index * batch_size: (index + 1) * batch_size]
> >     })
> > for i in range(5):
> >     result = train(i)
> >     print(result)
> >
> > this is the result of the print:
> >
> >
> > 39.96000000089407
> > 39.96000000089407
> > 39.96000000089407
> > 39.96000000089407
> > 39.96000000089407
> >
> > Now, the gradient of mean(x*W/y) is equal to 1 (because x and y always
> > have the same values, so x/y is always 1). So the first call should give
> > 39.95, then 39.90, and so on... Why do I always get the same result?
> >
> > Thanks
> >
> > --
> >
> > ---
> > You received this message because you are subscribed to the Google
> > Groups "theano-users" group.
> > To unsubscribe from this group and stop receiving emails from it, send
> > an email to [email protected].
> > For more options, visit https://groups.google.com/d/optout.
>
>
> --
> Pascal
>