Hi,
I am trying to implement a dynamic learning rate, but I do not understand
how the learning rate is updated in the code you have shared. I cannot see
where the learning rate is updated as learning_rate = learning_rate * 0.8.
In addition, I have tried to implement the code but it does not work, and I
would like to understand the code so I can figure out what is wrong in
mine.
Thank you in advance.
Beatriz.
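
(For reference, a minimal sketch of the decay described above, built around the
train_model from Ofir's snippet below, which takes the learning rate as its
second input; the 0.8 factor and the names n_epochs and n_train_batches are
assumptions for illustration, not part of the original code:)

learning_rate = 0.1

for epoch in range(n_epochs):
    for minibatch_index in range(n_train_batches):
        # the current rate is passed in as the l_r input of the compiled function
        cost_ij = train_model(minibatch_index, learning_rate)
    # shrink the rate once per epoch: learning_rate = learning_rate * 0.8
    learning_rate = learning_rate * 0.8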
On Monday, October 6, 2014 at 17:14:11 (UTC+2), Ofir Levy wrote:
>
> ok I think I got it
>
> learning_rate = 0.1
>
> l_r = T.scalar('l_r', dtype=theano.config.floatX)
>
> updates = []
> for param_i, grad_i in zip(params, grads):
>     updates.append((param_i, param_i - l_r * grad_i))
>
> train_model = theano.function([index,l_r], cost, updates = updates,
> givens={
> x: train_set_x[index * batch_size: (index + 1) * batch_size],
> y: train_set_y[index * batch_size: (index + 1) * batch_size]})
>
> and in the training loop:
>
> cost_ij = train_model(minibatch_index, learning_rate)
>
>
>
> On Monday, October 6, 2014 5:38:33 PM UTC+3, Ofir Levy wrote:
>>
>> for the CNN example we currently have the following code:
>>
>> learning_rate = 0.1
>>
>> updates = []
>> for param_i, grad_i in zip(params, grads):
>>     updates.append((param_i, param_i - learning_rate * grad_i))
>>
>> train_model = theano.function([index], cost, updates = updates,
>> givens={
>> x: train_set_x[index * batch_size: (index + 1) * batch_size],
>> y: train_set_y[index * batch_size: (index + 1) * batch_size]})
>>
>> and in the training loop:
>>
>> cost_ij = train_model(minibatch_index)
>>
>>
>> can you kindly tell me how to change it to have an adaptive learning rate?
>>
>>
>>
>>
>>
>>
>> On Thursday, July 17, 2014 9:48:24 PM UTC+3, Frédéric Bastien wrote:
>>>
>>> Make a theano variable that is the learning rate and pass it as an input
>>> to your theano function.
>>>
>>> You could also use a shared variable if you don't want to pass it
>>> explicitly each time, but only change it from time to time:
>>>
>>>
>>> http://deeplearning.net/software/theano/tutorial/examples.html#using-shared-variables
>>>
>>> Fred
>>>
>>>
>>> On Thu, Jul 17, 2014 at 2:42 PM, <[email protected]> wrote:
>>>
>>>> Hi,
>>>>
>>>> I would like to change the learning rate during the learning procedure: a
>>>> large learning rate in the initial stage and a smaller one later on. How
>>>> can I do this?
>>>>
>>>> Thanks
>>>> Jiancheng
>>>>
>>>
>>>
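
(A minimal sketch of the shared-variable alternative Fred suggests above,
assuming the same params, grads, cost, index, x, y and batch variables as in
the snippets; the 0.8 decay factor and the once-per-epoch call are
assumptions:)

import numpy
import theano

# keep the learning rate in a shared variable so it can be changed without recompiling
learning_rate = theano.shared(numpy.asarray(0.1, dtype=theano.config.floatX))

updates = []
for param_i, grad_i in zip(params, grads):
    updates.append((param_i, param_i - learning_rate * grad_i))

# note: no l_r input is needed here, unlike the version that passes it explicitly
train_model = theano.function([index], cost, updates=updates,
    givens={
        x: train_set_x[index * batch_size: (index + 1) * batch_size],
        y: train_set_y[index * batch_size: (index + 1) * batch_size]})

# later, e.g. at the end of each epoch, shrink the stored rate in place
new_rate = numpy.asarray(learning_rate.get_value() * 0.8, dtype=theano.config.floatX)
learning_rate.set_value(new_rate)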