Hi,
I wonder if it possible to define two different theano functions on the
same computation network.
I provide a pseudo-code of my algorithm for clarification:
*# Main loop*:
Line 0: f_getvalues, f_train, trans = build(...)
Line 1: Check value of 'trans'* # t0*
Line 2: for each epoch
Line 3: for each x
Line 4: [c1] = f_getvalues(x)
Line 5: Check value of 'trans'* # t1*
Line 6: [c2] = f_train(x)
Line 7: Check value of 'trans'* # t2*
Line 8: end
Line 9: end
*# Computation graph*
def build(...):
.
.
.
transition = a shared variable with random initialization
cost = sum(transition[idx]) + ...
def update1(cost,params):
gradients = T.grad(cost, params)
for p, g in zip(params, gradients):
updates.append((p, p - 0.005 * g))
def update2(cost,params):
gradients = T.grad(cost, params)
for p, g in zip(params, gradients):
updates.append((p, p - 0.005 * *0.0*))
f_getvalues = theano.function(input = train_input, output = [cost], updates
= update1)
f_train = theano.function(input = train_input, output = [cost], updates =
update2)
return f_getvalues, f_train, transition
I first check the value of transition at Line 1 to see the value it gets by
random initialization, Let's say it was t0.
Then I expect that t1 be equal to t0 as there is no real update in
*update1()*. And, t2 be different because transition should get updated in
*update2()*.
However by debugging the code, what I see is: t0, t1, t2 are all equal for
each sample x. That is the value of transition only changes after calling
f_getvalues() over the next sample x.
What is wrong ?
Thanks in advance,
Hanieh
--
---
You received this message because you are subscribed to the Google Groups
"theano-users" group.
To unsubscribe from this group and stop receiving emails from it, send an email
to [email protected].
For more options, visit https://groups.google.com/d/optout.