Hello Russell,

Recently I was aiming to implement ReLU hidden units for an RBM but ran out of time. I had found some interesting links that may help you:

Prof. Hinton's lecture (especially the 2nd lecture, at 16:58-26:50):
http://videolectures.net/mlss09uk_hinton_dbn/

And these posts on how to sample:
http://stackoverflow.com/questions/24149765/gaussian-rbm-with-nrlu-hidden-units-in-dbn
https://www.quora.com/What-is-special-about-rectifier-neural-units-used-in-NN-learning

Concerning the energy function, my guess is to treat the ReLU units as binary units. There is also MORB, which implements ReLUs, if you need code to check out:
https://groups.google.com/forum/#!topic/theano-users/hFzhDjrWr9I
(I tried to use MORB with ReLU, but with no success.)
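In case a concrete starting point is useful, here is a rough NumPy sketch of the sampling rule from the Nair & Hinton paper (noisy rectified linear units, NReLU), together with the "treat the hiddens as binary" free energy I guessed at above. It assumes unit-variance Gaussian visibles, and the function names are mine, so treat it as a sketch rather than a tested implementation:

    import numpy as np

    def sigmoid(x):
        return 1.0 / (1.0 + np.exp(-x))

    def sample_h_given_v_nrelu(v, W, hbias, rng):
        # Nair & Hinton (2010) approximate an infinite set of tied binary
        # units by h = max(0, x + N(0, sigmoid(x))), where x is the usual
        # pre-activation and sigmoid(x) is the noise *variance*.
        x = v @ W + hbias
        noise = rng.normal(0.0, np.sqrt(sigmoid(x)))
        return np.maximum(0.0, x + noise)

    def free_energy_relu_as_binary(v, W, vbias, hbias):
        # My guess from above: keep the binary-hidden free energy, i.e. a
        # quadratic visible term (unit-variance Gaussian visibles) plus a
        # softplus term for the hiddens, even though the hidden samples
        # themselves are NReLU.
        x = v @ W + hbias
        visible_term = 0.5 * np.sum((v - vbias) ** 2, axis=-1)
        hidden_term = np.sum(np.logaddexp(0.0, x), axis=-1)  # softplus
        return visible_term - hidden_term

    # Usage:
    # rng = np.random.default_rng(0)
    # h = sample_h_given_v_nrelu(v, W, hbias, rng)

At test time you would typically use the noise-free mean max(0, x) instead of drawing a sample.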
Hope it helps, and good luck.
Akataios

On Monday, 27 February 2017 03:42:52 UTC+2, Russell Tsuchida wrote:
> Hi all,
>
> Using the code at the tutorial <http://deeplearning.net/tutorial/rbm.html#rbm>
> as a starting point, I've been able to implement an RBM with Gaussian
> visible layers simply by overwriting the free_energy, sample_v_given_h and
> get_reconstruction_cost methods. As per Nair and Hinton
> <https://www.cs.toronto.edu/~hinton/absps/reluICML.pdf>, I'm trying to
> implement an RBM with ReLU hidden units. How should this be done? I'm not
> sure how to sample p(h|v), since interpreting max(0,x) as a probability
> must have some caveats. Also, how should the free_energy function be
> modified?
>
> Thanks,
> Russell.
