Hello Jim, 

I'm currently working on the same problem using Theano. 
Have you implemented the contrastive wake-sleep algorithm with this library? 
If so, could you give me some guidance?  

Many thanks, 
François

On Thursday, 14 July 2016 at 11:32:38 UTC+2, Jim O' Donoghue wrote:
>
> So I'm going to reply to my own question in case it helps anyone else out. 
> I had another look at the paper; I had forgotten about the contrastive 
> wake-sleep algorithm. That's what is used to fine-tune the network 
> completely unsupervised. 
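For anyone who finds this later, here is a minimal NumPy sketch of the plain wake-sleep updates for a single sigmoid belief-net layer. All names, shapes, the learning rate and the flat top-level prior are my own simplifications, not from this thread or from Theano's API; the contrastive variant used for DBNs additionally runs contrastive divergence at the top-level RBM, which is omitted here.

```python
# Hedged sketch of one wake-sleep step for a single sigmoid belief-net
# layer. Everything here (names, sizes, flat prior) is illustrative only.
import numpy as np

rng = np.random.default_rng(0)
sigmoid = lambda x: 1.0 / (1.0 + np.exp(-x))

n_vis, n_hid, lr = 6, 4, 0.1
R = rng.normal(0, 0.1, (n_vis, n_hid))   # recognition weights, v -> h
G = rng.normal(0, 0.1, (n_hid, n_vis))   # generative weights,  h -> v

def wake_sleep_step(v):
    global R, G
    # Wake phase: recognize h from real data, then train the generative
    # weights so that p(v|h) reconstructs the data from that h.
    h = (rng.random(n_hid) < sigmoid(v @ R)).astype(float)
    v_recon = sigmoid(h @ G)
    G += lr * np.outer(h, v - v_recon)            # delta rule on p(v|h)
    # Sleep phase: dream (h, v) from the generative model, then train the
    # recognition weights to recover the h that produced the dreamed v.
    h_dream = (rng.random(n_hid) < 0.5).astype(float)  # flat prior for brevity
    v_dream = (rng.random(n_vis) < sigmoid(h_dream @ G)).astype(float)
    h_guess = sigmoid(v_dream @ R)
    R += lr * np.outer(v_dream, h_dream - h_guess)     # delta rule on q(h|v)

v = rng.integers(0, 2, n_vis).astype(float)
for _ in range(10):
    wake_sleep_step(v)
```

In Theano one would express the same updates symbolically and compile them with theano.function, but the arithmetic is the same.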
>
> On Tuesday, 12 July 2016 15:40:48 UTC+1, Jim O' Donoghue wrote:
>>
>> Hi There,
>>
>> Just wondering how you would fine-tune a DBN for a completely 
>> unsupervised task, i.e. a practical implementation of "fine-tune all the 
>> parameters of this deep architecture with respect to a proxy for the DBN 
>> log-likelihood". 
>>
>> Would this be something like, for example, a negative log likelihood 
>> between the original input and its reconstruction after the data has been 
>> propagated all the way up and back down the network? What would make the 
>> final layer an RBM and the rest ordinary directed layers? Or would the 
>> only way to do this be to completely unroll the network and fine-tune it 
>> like a deep autoencoder (as in "Reducing the Dimensionality of Data with 
>> Neural Networks")? 
>>
>> Many thanks,
>> Jim
>>
>>
>>
>>
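The "propagate up, then back down, score the reconstruction" proxy described in the question could be sketched in NumPy as follows. The layer sizes, the tied transposed weights on the downward pass, and all names are my own assumptions, not anything from the paper or from Theano:

```python
# Hedged sketch of a reconstruction cross-entropy proxy for the DBN
# log-likelihood: mean-field up pass, tied-weight down pass, then a
# cross-entropy between input and reconstruction. Illustrative only.
import numpy as np

sigmoid = lambda x: 1.0 / (1.0 + np.exp(-x))
rng = np.random.default_rng(1)

sizes = [8, 6, 4]                        # visible -> hidden -> top (made up)
W = [rng.normal(0, 0.1, (a, b)) for a, b in zip(sizes[:-1], sizes[1:])]

def reconstruction_nll(v, eps=1e-7):
    # Deterministic mean-field pass up through every layer...
    h = v
    for Wi in W:
        h = sigmoid(h @ Wi)
    # ...then back down with the transposed weights (autoencoder-style tying).
    for Wi in reversed(W):
        h = sigmoid(h @ Wi.T)
    v_recon = np.clip(h, eps, 1.0 - eps)
    # Cross-entropy between the binary input and its reconstruction.
    return -np.sum(v * np.log(v_recon) + (1 - v) * np.log(1 - v_recon))

v = rng.integers(0, 2, sizes[0]).astype(float)
cost = reconstruction_nll(v)             # a positive scalar to minimize
```

In Theano this cost would be built symbolically so that T.grad(cost, params) gives the fine-tuning gradients for all layers at once.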

-- 

--- 
You received this message because you are subscribed to the Google Groups 
"theano-users" group.
To unsubscribe from this group and stop receiving emails from it, send an email 
to [email protected].
For more options, visit https://groups.google.com/d/optout.