Do you need to take derivatives through the activation? If not, you could use switch, e.g.:

import theano.tensor as T

x = T.vector('x')  # x = some theano variable
threshold = .5
x_binary = T.switch(x > threshold, 1., 0.)
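To sanity-check the behavior without compiling a Theano function: `T.switch` selects elementwise, much like NumPy's `np.where`, so a plain-NumPy sketch of the same thresholding (the example values here are just illustrative) looks like:

```python
import numpy as np

x = np.array([0.2, 0.5, 0.7, 0.49, 0.51])
threshold = 0.5
# np.where mirrors T.switch: pick 1. where x > threshold, else 0.
x_binary = np.where(x > threshold, 1., 0.)
print(x_binary)  # [0. 0. 1. 0. 1.]
```

Note the strict `>`: a value exactly at the threshold maps to 0.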

On Wednesday, July 12, 2017 at 10:27:32 AM UTC-7, [email protected] wrote:
>
> In the binarized network github code (), Matthieu used stochastic 
> binarization. I'm wondering how to define a simple deterministic binary 
> activation, instead of a stochastic one, in Theano?
>

-- 

--- 
You received this message because you are subscribed to the Google Groups 
"theano-users" group.