Alvaro Begue: 
<CAF8dVMVMwi65m9jMTsvOa=qzortqz-dedh5494uwzeld9su...@mail.gmail.com>:
>On Tue, May 23, 2017 at 4:51 AM, Hideki Kato <[email protected]> wrote:
>
>> (3) CNN cannot learn the exclusive-or function due to the ReLU
>> activation function, used instead of the traditional sigmoid
>> (hyperbolic tangent).  CNN is good at approximating continuous
>> (analog) functions but not Boolean (digital) ones.
>>
>
>Oh, not this nonsense with the XOR function again.
>
>You can see a neural network with ReLU activation function learning XOR
>right here:
>http://playground.tensorflow.org/#activation=relu&batchSize=10&dataset=xor&regDataset=reg-plane&learningRate=0.01&regularizationRate=0&noise=0&networkShape=4,4&seed=0.96791&showTestData=false&discretize=false&percTrainData=50&x=true&y=true&xTimesY=false&xSquared=false&ySquared=false&cosX=false&sinX=false&cosY=false&sinY=false&collectStats=false&problem=classification&initZero=false&hideText=false
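
For the record, a two-hidden-unit ReLU network can represent XOR exactly,
not merely approximate it.  A minimal sketch in plain NumPy (weights chosen
by hand rather than learned, and not taken from the playground link above),
which only demonstrates representability:

    import numpy as np

    def relu(z):
        return np.maximum(z, 0.0)

    # Hand-picked weights: h1 = relu(x + y), h2 = relu(x + y - 1),
    # output = h1 - 2*h2 reproduces XOR exactly on {0,1}^2.
    W1 = np.array([[1.0, 1.0],
                   [1.0, 1.0]])
    b1 = np.array([0.0, -1.0])
    w2 = np.array([1.0, -2.0])

    for x in (0.0, 1.0):
        for y in (0.0, 1.0):
            h = relu(np.array([x, y]) @ W1 + b1)
            print(int(x), int(y), "->", h @ w2)
    # prints: 0 0 -> 0.0, 0 1 -> 1.0, 1 0 -> 1.0, 1 1 -> 0.0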

That NN has no "sharp" edges.  With a sigmoid (hyperbolic tangent) 
activation function, scaling the weights changes the sharpness 
of the edges of the approximated function.  With ReLU, scaling 
the weights only changes the slope.
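
To make the contrast concrete, a toy 1-D sketch in NumPy (not from the
original post): scaling the input weight of a tanh unit sharpens its
transition around the edge, while relu(w*x) = w*relu(x) for w > 0, so
scaling only rescales the output slope:

    import numpy as np

    x = np.linspace(-1.0, 1.0, 5)   # sample points around the "edge" at 0

    for w in (1.0, 10.0, 100.0):
        tanh_out = np.tanh(w * x)           # transition sharpens as w grows
        relu_out = np.maximum(w * x, 0.0)   # equals w*relu(x): same shape, bigger slope
        print(f"w={w:6.1f}  tanh: {np.round(tanh_out, 3)}  relu: {np.round(relu_out, 1)}")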

Hideki
-- 
Hideki Kato <mailto:[email protected]>