Hello

I am new to Theano. Could you help me understand how to use the get_value() 
function to save the weights and biases after the pretraining phase, in 
both the DBN and the SdA? 
I would really appreciate your help.

On Wednesday, October 22, 2014 at 11:56:00 PM UTC+3:30, Kin wrote:
>
> *Thank you very much for your help, Pascal!* I am now able to save the 
> weights and biases by simply using the get_value function as suggested to 
> acquire the weights and biases of each RBM layer at the end of the 
> pretraining phase. If loading the pretrained model is desired, I skipped 
> the pretraining portion of the code entirely and the values (W, hbias, 
> vbias) are reassigned using set_value() prior to finetuning.
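The round trip described there, sketched with a NumPy array standing in for rbm.W.get_value() (the variable and file names here are mine, not from the tutorial code). numpy.save/numpy.load preserve shape, dtype, and values exactly, so set_value() receives back exactly what get_value() produced:

```python
import os
import tempfile

import numpy as np

# Stand-in for rbm.W.get_value(): get_value() returns a plain NumPy
# array (rbm is hypothetical; in the tutorial code it would be one of
# the DBN's RBM layers).
W_value = np.arange(12, dtype='float32').reshape(4, 3)

path = os.path.join(tempfile.mkdtemp(), 'rbm0_W.npy')
np.save(path, W_value)      # at the end of pretraining

restored = np.load(path)    # in a later run: rbm.W.set_value(restored)
assert restored.dtype == W_value.dtype
assert (restored == W_value).all()
```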
>
> - Cage
>
> On Tuesday, October 21, 2014 11:51:44 AM UTC-5, Pascal Lamblin wrote:
>>
>> On Mon, Oct 20, 2014, Cage Lawrence wrote: 
>> > I have used my own dataset and it took almost 8 hours just to pretrain 
>> > the model (without finetuning). I would love to save the weights and 
>> > biases for each layer to a file, so that the next time I could simply 
>> > load the model from the file and skip the pretraining process. What 
>> > other considerations should be noted if I were to save the weights and 
>> > biases *after* finetuning as well? 
>>
>> One easy way, but maybe not the most flexible, would be to have a `save` 
>> method in DBN, that would get all the (vbias, W, hbias) parameters 
>> for all the RBMs, and for the final softmax layer (W and b). For 
>> each of these shared variables, you can then call numpy.save on the 
>> value returned by that_var.get_value(), so each of them is saved in a 
>> different .npy file. 
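A minimal sketch of that save step (save_params and the parameter-naming scheme are my own invention, not part of the tutorial code). Since get_value() returns a plain NumPy array, the file handling reduces to one numpy.save per parameter:

```python
import os

import numpy as np

def save_params(param_dict, out_dir):
    """Save each parameter array to its own .npy file.

    param_dict maps a name such as 'rbm0_W' to the NumPy array
    returned by that shared variable's get_value().
    """
    if not os.path.isdir(out_dir):
        os.makedirs(out_dir)
    for name, arr in param_dict.items():
        np.save(os.path.join(out_dir, name + '.npy'), arr)

# Inside DBN.save this would be called roughly as (attribute names are
# guesses based on the tutorial code):
#   save_params({'rbm%d_W' % i: rbm.W.get_value()
#                for i, rbm in enumerate(self.rbm_layers)}, out_dir)
```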
>>
>> Then, in a `load` method, you could load them all and put them in shared 
>> variables, then go through the logic of `__init__` again, but specifying 
>> all the parameters for all of the components, instead of leaving some 
>> of them with the default value. For instance, you will have to provide 
>> vbias to the constructor of `RBM`. 
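The matching load step might look like this (load_params and the naming scheme are hypothetical; the trailing comment shows how the loaded arrays would feed back into the RBM constructor, using the keyword names from the tutorial's RBM.__init__):

```python
import os

import numpy as np

def load_params(names, in_dir):
    """Load the arrays written by the matching save step, one .npy
    file per parameter name."""
    return {name: np.load(os.path.join(in_dir, name + '.npy'))
            for name in names}

# Back in DBN.load, each array would be wrapped in a shared variable
# and passed to the RBM constructor instead of the default random
# initializer, e.g.:
#   W = theano.shared(value=params['rbm0_W'], name='W', borrow=True)
#   rbm = RBM(..., W=W, hbias=hbias, vbias=vbias)
```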
>>
>> After fine-tuning, it is essentially the same thing, except that the value 
>> for the visible biases would not be used (and not change) during 
>> fine-tuning. 
>>
>> If you want to try different hyper-parameters (sizes of layers, and so 
>> on), you can also create a file containing those values (maybe as a 
>> dictionary) in DBN.save, and use it in DBN.load. 
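A sketch of that hyper-parameter file using pickle (the helper names are mine; the dictionary keys mirror the tutorial DBN's constructor arguments):

```python
import pickle

def save_hyper(hyper, path):
    """Write the hyper-parameter dictionary next to the .npy files."""
    with open(path, 'wb') as f:
        pickle.dump(hyper, f)

def load_hyper(path):
    """Read it back in DBN.load to rebuild the same architecture."""
    with open(path, 'rb') as f:
        return pickle.load(f)

# Example values only; use whatever the model was built with.
hyper = {'n_ins': 28 * 28,
         'hidden_layers_sizes': [1000, 1000, 1000],
         'n_outs': 10}
```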
>>
>> > I have read the Theano documentation on saving/loading models, and also 
>> > searched this group, but the people who encountered the same problem did 
>> > not obtain a particularly detailed reply. Either that, or they found 
>> > solutions very specific to topics outside of DBNs. What solutions would 
>> > you recommend? Should I be looking into editing the code within RBM.py 
>> > instead of DBN.py? 
>>
>> > 
>> > *Below are the threads/topics that I've read (but they didn't help my 
>> > case...):* 
>> > Theano Official Documentation: 
>> > http://deeplearning.net/tutorial/gettingstarted.html#loading-and-saving-models 
>> > Saving Weights and Biases for *MLP*: 
>> > https://groups.google.com/forum/#!searchin/theano-users/save/theano-users/gWbjkIpSjC4/mL-747Yz-P0J 
>> > Saving Weights and Biases for *Convolutional Networks*: 
>> > https://groups.google.com/forum/#!searchin/theano-users/saving/theano-users/tdF_j_JyD24/2iNU8ezProoJ 
>> > Saving & Resuming Model Training: 
>> > https://groups.google.com/forum/#!searchin/theano-users/saving$20parameters$20dbn/theano-users/2VwXqQgnBwA/-Xyu-uR4KXwJ 
>> > Save and Load Parameters of *DBN*: 
>> > https://groups.google.com/forum/#!searchin/theano-users/save$20parameters$20dbn/theano-users/GLcmlLoONic/u8KFFuG9JEMJ 
>> > 
>> > Thank you very much! 
>> > 
>> > - Cage 
>> > 
>>
>> -- 
>> Pascal 
>>
>

-- 

--- 
You received this message because you are subscribed to the Google Groups 
"theano-users" group.
To unsubscribe from this group and stop receiving emails from it, send an email 
to [email protected].
For more options, visit https://groups.google.com/d/optout.
