I am trying to do something similar, but the batch sizes of the two models I 
am trying to update are different, and when I use the same approach, 
updates=(updates1 + updates2), I get an error. I only need to train one 
model and then copy its trained parameters into the other model; I do not 
need to train both models simultaneously. Can you suggest what I can do to 
fix this?
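
To be concrete, here is roughly what I have in mind (an untested sketch; 
train_model1 is assumed to be a Theano function built with updates1 only, 
and both MLPs are assumed to have identical architectures):

    # Train only the first model.
    for epoch in range(n_epochs):
        for minibatch_index in range(n_train_batches):
            train_model1(minibatch_index)

    # Copy the trained parameters of classifier1 into classifier2.
    # The params are Theano shared variables, so get_value()/set_value()
    # should perform the copy, as long as the shapes match.
    for p1, p2 in zip(classifier1.params, classifier2.params):
        p2.set_value(p1.get_value())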

On Wednesday, May 18, 2016 at 7:33:56 PM UTC-4, taleb alashkar wrote:
>
> thank you Pascal, it works well :)
>
> On Wed, May 18, 2016 at 6:44 PM, Pascal Lamblin <[email protected]> wrote:
>
>> You can combine the lists of updates, simply passing "updates=(updates1
>> + updates2)", as long as classifier1.params and classifier2.params are
>> disjoint.
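>>
>> That is, something like this, reusing the names from your message:
>>
>>     train_model = theano.function(
>>         inputs=[index],
>>         outputs=cost,
>>         updates=(updates1 + updates2),
>>         givens={
>>             x: train_set_x[index * batch_size: (index + 1) * batch_size],
>>             y: train_set_y[index * batch_size: (index + 1) * batch_size],
>>         }
>>     )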
>>
>>
>> On Wed, May 18, 2016, taleb alashkar wrote:
>> > Dear all,
>> >
>> > Can I use more than one set of updates in a single Theano function?
>> > I am training two different networks using the same data, and the cost
>> > function is a weighted average of both of them.
>> > I want to optimize this combined cost with respect to the parameters of
>> > the two classifiers (classifier1, classifier2), which are multi-layer
>> > perceptrons (MLPs).
>> >
>> > I am doing this:
>> >
>> >     classifier1 = MLP(...)
>> >     classifier2 = MLP(...)
>> >
>> >     cost1 = (
>> >         classifier1.negative_log_likelihood(y)
>> >         + L1_reg * classifier1.L1
>> >
>> >     )
>> >
>> >
>> >     cost2 = (
>> >         classifier2.negative_log_likelihood(y)
>> >         + L1_reg * classifier2.L1
>> >     )
>> >
>> >
>> >
>> >     cost = 0.3 * cost1 + 0.7 * cost2
>> >
>> >     gparams1 = [T.grad(cost, param1) for param1 in classifier1.params]
>> >     gparams2 = [T.grad(cost, param2) for param2 in classifier2.params]
>> >
>> >
>> >
>> >     updates1 = [
>> >         (param1, param1 - learning_rate * gparam1)
>> >         for param1, gparam1 in zip(classifier1.params, gparams1)
>> >     ]
>> >
>> >     updates2 = [
>> >         (param2, param2 - learning_rate * gparam2)
>> >         for param2, gparam2 in zip(classifier2.params, gparams2)
>> >     ]
>> >
>> >
>> >     Can I use both updates1 and updates2 in one Theano training
>> > function, like this?
>> >     If not, could you suggest an alternative or point me to a similar
>> > example?
>> >
>> >
>> >
>> >     train_model = theano.function(
>> >         inputs=[index],
>> >         outputs=cost,
>> >         updates=[updates1, updates2],
>> >         givens={
>> >             x: train_set_x[index * batch_size: (index + 1) * batch_size],
>> >             y: train_set_y[index * batch_size: (index + 1) * batch_size],
>> >         }
>> >     )
>> >
>> >
>> >
>>
>>
>> --
>> Pascal
>>
>
>
>
> -- 
> *Taleb ALASHKAR*
> * Computer Engineering; PhD*
>
>
