ptrendx opened a new pull request #17966: Omit kNullOp req when comparing changed NDArrays in static_shape=True backward of CachedOp
URL: https://github.com/apache/incubator-mxnet/pull/17966
 
 
   ## Description ##
   When doing `autograd.backward` on a `HybridBlock` with constant parameters, gradient output arrays for those parameters are not provided by the frontend (since they are not needed); instead, temporary arrays are allocated each time backward is executed.
   However, a `CachedOp` hybridized with `static_shape=True` checks that all input/output arrays match the saved ones. Because the outputs for those placeholder gradients are freshly allocated on every call, the check can fail, causing the engine ops to be regenerated on each iteration and introducing big gaps in the execution.
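   A minimal Python sketch of the pattern that hits this path (a hedged illustration; the network, shapes, and which parameter is treated as constant are all made up for the example):

   ```python
   import mxnet as mx
   from mxnet import autograd, gluon

   net = gluon.nn.Dense(16)
   net.initialize()
   # Treat the weight as a constant parameter: grad_req='null' means the
   # frontend never supplies a gradient output array for it to backward.
   net.weight.grad_req = 'null'
   net.hybridize(static_alloc=True, static_shape=True)

   x = mx.nd.random.uniform(shape=(8, 32))
   for _ in range(10):
       with autograd.record():
           y = net(x)
       # Before this change, the temporary array allocated for the null
       # gradient could fail the CachedOp's saved-array check, regenerating
       # the engine ops on every iteration.
       y.backward()
       mx.nd.waitall()
   ```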
   Since the `arrays` in the `CachedOp` are saved in its state, it is actually safe to reuse the already saved `NDArray` even when there is a mismatch, as long as the `kNullOp` req is set (the array is never written); this PR makes the comparison omit such entries.
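   For illustration, a hedged Python sketch of the comparison idea (the real change lives in the C++ `CachedOp` code; `match_arrays`, its arguments, and the req encoding here are hypothetical):

   ```python
   def match_arrays(saved, provided, reqs):
       # Hypothetical helper: decide whether the saved engine state can be
       # reused. Entries with req 'null' (kNullOp) are never written, so a
       # mismatch there is harmless and the saved NDArray is reused.
       for s, p, req in zip(saved, provided, reqs):
           if req == 'null':   # kNullOp: skip the comparison entirely
               continue
           if s is not p:      # any real mismatch still forces regeneration
               return False
       return True
   ```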
   
   @zheng-da @eric-haibin-lin 
   
   ## Checklist ##
   ### Essentials ###
   Please feel free to remove inapplicable items for your PR.
   - [x] Changes are complete (i.e. I finished coding on this PR)
   - [x] To the best of my knowledge, examples are either not affected by this 
change, or have been fixed to be compatible with this change
   
   ## Comments ##
   - Since the effect is purely a performance improvement, I'm not sure how to test this change
   

----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
[email protected]


With regards,
Apache Git Services
