chrishkchris commented on issue #572: hotfix: cpp softmax not impl
URL: https://github.com/apache/singa/pull/572#issuecomment-573575646
 
 
   @dcslin  I tested examples/autograd/mlp.py (multilayer perceptron) and got the following error:
   ```
   ubuntu@ip-172-31-26-47:~/singa/examples/autograd$ python3 mlp.py
   train_data_shape: (400, 2)
   train_label_shape: (400, 2)
   training loss =  6.682101
   WARNING: Logging before InitGoogleLogging() is written to STDERR
    F0113 09:20:05.180658 12032 tensor_math_cpp.h:357] Check failed: a > 0.f (-nan vs. 0)
   *** Check failure stack trace: ***
   Aborted (core dumped)
   ```
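   For reference, the loss in that run was computed with the two separate ops. A minimal sketch of that variant (assuming the singa.autograd op names `softmax` and `cross_entropy`; `x` is the model output and `target` the label tensor, as in mlp.py):
   ```
   from singa import autograd

   # Two-step loss: softmax first, then cross-entropy on the probabilities.
   # This is the combination that triggers the "Check failed: a > 0.f" abort.
   probs = autograd.softmax(x)
   loss = autograd.cross_entropy(probs, target)
   ```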
   However, if I use softmax_cross_entropy instead of softmax + cross_entropy, i.e.
   `loss = autograd.softmax_cross_entropy(x, target)`,
   it runs fine. The output is then:
   ```
   ubuntu@ip-172-31-26-47:~/singa/examples/autograd$ python3 mlp.py
   train_data_shape: (400, 2)
   train_label_shape: (400, 2)
   training loss =  0.6908062
   training loss =  0.5960194
   training loss =  0.57797414
   training loss =  0.55334115
   training loss =  0.48568404
   training loss =  0.38458923
   training loss =  0.30776194
   training loss =  0.24188559
   training loss =  0.18657134
   training loss =  0.15864176
   training loss =  0.13929243
   ```
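   The `Check failed: a > 0.f` at tensor_math_cpp.h:357 looks like a log kernel rejecting a non-positive input, which suggests the separate softmax produced a 0 (or nan) probability that cross_entropy then tried to take the log of. The fused softmax_cross_entropy avoids this by staying in log space. A minimal NumPy sketch (an illustration of the numerics, not SINGA's actual kernel):
   ```
   import numpy as np

   x = np.array([1000.0, 0.0])  # a large logit exaggerates the effect

   # Two-step log(softmax(x)): exp(1000) overflows to inf, so the
   # normalized probabilities become [nan, 0] and the log yields nan/-inf.
   p = np.exp(x) / np.sum(np.exp(x))
   print(np.log(p))             # [nan -inf], plus overflow warnings

   # Fused form via the log-sum-exp trick: shift by max(x) first,
   # so every intermediate stays finite.
   shifted = x - np.max(x)
   print(shifted - np.log(np.sum(np.exp(shifted))))  # [0. -1000.]
   ```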
   
   
   
   
