QueensGambit commented on issue #18054: Bump ONNX version to 1.5.0
URL: https://github.com/apache/incubator-mxnet/pull/18054#issuecomment-616215616
 
 
   > I personally only want to export from MxNet to ONNX and don't care at all 
about importing from ONNX to MxNet. However, I am not sure whether a PR that 
drops support for importing the Pad operator from ONNX would ever get accepted. 
Even in this PR, I had to drop importing the newer opset versions of Slice and 
TopK, and I am not sure whether this will get accepted without any complaints.
   
   I'm also interested in MXNet to ONNX export, to allow native TensorRT and 
possibly onnxjs inference, but I agree that this shouldn't break the ONNX to 
MXNet import.
   
   > This PR doesn't add any new functionality, and I think it should stay that 
way. This PR will just bump the ONNX version; any new operators and 
functionality that can be implemented with the newer version will come in 
separate PRs.
   
   I asked for ONNX 1.6.0 support to make use of the bug fixes in 1.6.0 and 
to allow export with a dynamic batch size.
   However, you're right that this PR shouldn't add too many things at once and 
that dynamic inputs could be added later.
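
   For reference, a dynamic batch size boils down to replacing the fixed batch 
dimension of the exported model's inputs and outputs with a symbolic `dim_param`. 
Below is a minimal sketch using only the `onnx` package; the tiny Identity graph 
is a hypothetical stand-in for a real MXNet-exported model, and the name 
`"batch"` is an arbitrary choice:

   ```python
   # Sketch: post-processing an exported ONNX model to make its batch
   # dimension dynamic. Assumes the `onnx` package is available; the graph
   # here is a toy placeholder, not an actual MXNet export.
   import onnx
   from onnx import helper, TensorProto

   # Build a tiny model with a fixed batch dimension of 1.
   inp = helper.make_tensor_value_info("data", TensorProto.FLOAT, [1, 3, 224, 224])
   out = helper.make_tensor_value_info("out", TensorProto.FLOAT, [1, 3, 224, 224])
   node = helper.make_node("Identity", ["data"], ["out"])
   graph = helper.make_graph([node], "demo", [inp], [out])
   model = helper.make_model(graph)

   # Replace the fixed batch dim with a symbolic name on input and output.
   for value_info in (model.graph.input[0], model.graph.output[0]):
       dim0 = value_info.type.tensor_type.shape.dim[0]
       dim0.Clear()            # drop the fixed dim_value (1)
       dim0.dim_param = "batch"  # mark the dimension as symbolic

   # The model should still validate after the rewrite.
   onnx.checker.check_model(model)
   ```

   The same rewrite can be applied to any model MXNet exports today, so the 
dynamic-input part really can be split off from the version bump.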

----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
[email protected]


With regards,
Apache Git Services
