Hi everyone.

I would like to know what momentum is used for. I think it has something to 
do with the weight updates; I have been reading about it but I still do not 
fully understand it. Does it have something to do with a dynamic learning 
rate?

Regards.


On Thursday, 6 March 2014, 17:25:01 (UTC+1), Al Docherty wrote:
>
> Hello again,
>
> I'm considering adding momentum to my neural network implementation. The 
> gradients and updates are calculated as so:
>
> ### OBTAIN PARAMETERS AND GRADIENTS
> gparams = []
> for param in classifier.params:
>     gparam = T.grad(printcost, param)
>     gparams.append(gparam)
>
> ### CALCULATE CHANGE IN WEIGHTS
> updates = []
> for param, gparam in zip(classifier.params, gparams):
>     updates.append((param, param - eta * gparam))
>
>
> I know I need to add the momentum term to the updates.append line. But how 
> do I store an old set of gradients? 
>
> Al
>
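For reference, classical momentum does not require storing old gradients one by one: it keeps a single running "velocity" per parameter, into which past gradients are folded, and applies that velocity as the update. In Theano this is typically done with one extra shared variable per parameter. The sketch below shows the same idea in plain NumPy so it is self-contained; the function name `sgd_momentum_step`, the hyperparameters `eta` and `mu`, and the toy quadratic loss are illustrative assumptions, not part of the original code:

```python
import numpy as np

def sgd_momentum_step(params, grads, velocities, eta=0.01, mu=0.9):
    """One SGD-with-momentum step (a minimal sketch).

    v <- mu * v - eta * grad   # velocity accumulates past gradients
    p <- p + v                 # parameter moves along the velocity
    """
    for i in range(len(params)):
        velocities[i] = mu * velocities[i] - eta * grads[i]
        params[i] = params[i] + velocities[i]
    return params, velocities

# Toy example: minimise f(w) = 0.5 * ||w||^2, whose gradient is w itself.
w = [np.array([1.0, -2.0])]
v = [np.zeros_like(w[0])]   # one velocity buffer per parameter
for _ in range(100):
    grads = [w[0]]          # grad of 0.5 * ||w||^2 at w is w
    w, v = sgd_momentum_step(w, grads, v)
# After 100 steps w has spiralled close to the minimum at the origin.
```

With `mu = 0` this reduces to the plain update `param - eta * gparam` from the quoted code; the only state carried between iterations is the velocity buffer, which in Theano would live in a `theano.shared` variable updated through the same `updates` list.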
