juliusshufan opened a new pull request #9793: Enable the reporting of 
cross-entropy or nll loss value when training CNN network using the models 
defined by example/image-classification
URL: https://github.com/apache/incubator-mxnet/pull/9793
 
 
   ## Description ##
   MXNet already implements the loss computation at the Python layer (in 
python/mxnet/metric.py).
   For most CNN models used for image classification, softmax is commonly 
used as the output, so the cross-entropy or negative log-likelihood (NLL) 
loss value is helpful for monitoring the convergence trend during training.
   The current implementation of example/image-classification/fit.py already 
provides an extensible way to report useful information during training, 
such as accuracy.
   This submission builds on that design and enables reporting of the 
cross-entropy or NLL loss value during training.
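   For context, the cross-entropy computation over softmax outputs can be 
sketched in plain NumPy. This mirrors the idea (average of -log of the 
probability assigned to the true class), not the exact code in 
python/mxnet/metric.py:

```python
import numpy as np

def cross_entropy(probs, labels, eps=1e-12):
    """Illustrative cross-entropy: probs is a (batch, num_classes) array of
    softmax outputs, labels holds integer class ids. The eps guard avoids
    log(0) for confidently wrong predictions."""
    picked = probs[np.arange(len(labels)), labels]  # p(true class) per sample
    return float(-np.mean(np.log(picked + eps)))

# Toy batch of two samples over three classes
probs = np.array([[0.7, 0.2, 0.1],
                  [0.1, 0.8, 0.1]])
labels = np.array([0, 1])
loss = cross_entropy(probs, labels)  # mean of -ln(0.7) and -ln(0.8)
```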
    
   ## Checklist ##
   ### Essentials ###
   - [ ] Passed code style checking (`make lint`)
   - [ ] Changes are complete (i.e. I finished coding on this PR)
   - [ ] All changes have test coverage:
   - Unit tests are added for small changes to verify correctness (e.g. adding 
a new operator)
   - Nightly tests are added for complicated/long-running ones (e.g. changing 
distributed kvstore)
   - Build tests will be added for build configuration changes (e.g. adding a 
new build option with NCCL)
   - [ ] Code is well-documented: 
   - For user-facing API changes, API doc string has been updated. 
   - For new C++ functions in header files, their functionalities and arguments 
are documented. 
   - For new examples, README.md is added to explain what the example does, 
the source of the dataset, expected performance on the test set, and a 
reference to the original paper if applicable
   - [ ] To my best knowledge, examples are either not affected by this 
change or have been fixed to be compatible with it
   
   ### Changes ###
   All the code changes are in example/image-classification/common/fit.py
   A new argument '--loss' is introduced; it can be set to 'ce' or 
'nll-loss', corresponding to cross-entropy loss and negative log-likelihood 
loss respectively.
   Taking train_cifar10.py as an example, the loss value will be reported 
once the **bold code line** below is added:
   parser.set_defaults(
       ......
       **loss           = 'ce'**
       ......
   )
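   A minimal sketch of how such a flag might be wired up with argparse. The 
'--loss' name and its two values come from this PR; the parser setup and the 
flag-to-metric mapping below are illustrative, not the actual fit.py code:

```python
import argparse

# Hypothetical parser fragment: accept the new flag alongside the existing
# training options, restricted to the two values the PR describes.
parser = argparse.ArgumentParser()
parser.add_argument('--loss', default='', choices=['', 'ce', 'nll-loss'],
                    help="report a loss during training: 'ce' or 'nll-loss'")
args = parser.parse_args(['--loss', 'ce'])

# Illustrative mapping from flag value to the metric name shown in the log
loss_metrics = {'ce': 'cross-entropy', 'nll-loss': 'nll-loss'}
metric_name = loss_metrics.get(args.loss)
```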
   The output log will follow this pattern:
   INFO:root:Epoch[0] Batch [380]  Speed: 504.42 samples/sec       
accuracy=0.087109       top_k_accuracy_5=0.469531       
**cross-entropy=2.305971**
   INFO:root:Epoch[0] Train-accuracy=0.098437
   INFO:root:Epoch[0] Train-top_k_accuracy_5=0.506250
   INFO:root:Epoch[0] **Train-cross-entropy=2.304529**
   INFO:root:Epoch[0] Time cost=100.989
   INFO:root:Epoch[0] Validation-accuracy=0.098101
   INFO:root:Epoch[0] Validation-top_k_accuracy_5=0.486946
   INFO:root:Epoch[0] **Validation-cross-entropy=2.302873**
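   As a sanity check on the logged values: with the 10 CIFAR-10 classes, an 
untrained network predicting near-uniform class probabilities should show a 
cross-entropy of about ln(10), which matches the ~2.30 figures above:

```python
import math

# Expected cross-entropy for uniform predictions over k classes is ln(k);
# for CIFAR-10, k = 10, so early-training loss should hover near this value.
expected_initial_ce = math.log(10)
```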
   
   ## Comments ##
   - If this change is a backward incompatible change, why must this change be 
made.
   - Interesting edge cases to note here
   

----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services
