> Should the gradient be the logistic function?
>
> My understanding is that the gradient of the softrelu should go
> down to zero as x -> -Inf, which is not the case with the above
> definition, whose gradient goes to -Inf as x -> -Inf.
>
> https://en.wikipedia.org/wiki/Rectifier_(neural_networks)
>
>
> Pedro.
>
--
Matthias Seeger
some guidance on how to do that.
Bye, Matthias
--
Matthias Seeger
Hello,
I have already had a few PRs merged into MXNet (GitHub: mseeger). Could you
grant me access to Slack, and give me details on the channel?
--
Matthias Seeger