RuRo commented on issue #18054: Bump ONNX version to 1.5.0
URL: https://github.com/apache/incubator-mxnet/pull/18054#issuecomment-615884610
 
 
   @QueensGambit 
   
   > Should we instead aim for supporting the latest ONNX release 1.6.0 (Sep 
28, 2019) instead?
   
   I chose 1.5.0 because it was the lowest version that supported the operators I needed. Are there any particular operators/features you need from 1.6.0? I've just tried upgrading to 1.6.0, and quite a lot of tests fail with it, so it wouldn't be a seamless upgrade.
   
   The main issue with upgrading ONNX is that in newer opset versions, ONNX started moving some of the static "attributes" to be dynamic inputs. (For example, the `Pad` operator in 1.6.0 accepts the pad sizes as an actual `int64` tensor instead of an attribute, which is just a static list of ints.) It's pretty easy to **export** mxnet operators to such a format (just create a constant tensor), but AFAIK it's not currently possible to **import** such an operator, since `mx.sym.pad` expects the pad sizes to be known in advance.
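To illustrate the asymmetry (a hedged, plain-Python sketch, not MXNet/ONNX code — `build_pad_from_attribute` and `build_pad_from_input` are made-up names): a static graph importer runs before any data flows, so it can only use values it sees at graph-construction time.

```python
# Illustrative sketch only: why attribute-style pads can be imported into a
# static graph, but tensor-style pads cannot.

def build_pad_from_attribute(pads):
    # Opset <= 10 style: pads is a static list of ints stored on the node,
    # so the importer can bake the sizes into the symbolic op at build time
    # (the situation mx.sym.pad requires).
    def op(x):
        return [0] * pads[0] + list(x) + [0] * pads[1]
    return op

def build_pad_from_input(pads_tensor_name):
    # Opset 11 style: pads arrives as a runtime int64 tensor. At graph
    # construction time the importer only knows the *name* of that tensor,
    # not its values, so it cannot materialize the pad sizes here.
    raise NotImplementedError(
        f"pad sizes carried by tensor '{pads_tensor_name}' "
        "are unknown until runtime"
    )

pad = build_pad_from_attribute([2, 1])
print(pad([5, 6]))  # -> [0, 0, 5, 6, 0]
```

The export direction has no such problem: the exporter already holds the static sizes and can simply emit them as a constant tensor input.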
   
   I think there may be some developments in the new `numpy`-compatible API that could allow such operators to be imported, but I haven't looked into it yet.
   
   I personally only want to export from MxNet to ONNX and don't care at all about importing from ONNX to MxNet; however, I am not sure whether a PR that drops support for importing the `Pad` operator from ONNX would ever get accepted. Even in this PR, I had to drop importing the newer opset versions of `Slice` and `TopK`, and I am not sure whether that will be accepted without complaints.
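For the export side of those same operators, wrapping static values in constants is mechanical. A hedged sketch (the dict-based graph representation and the names `make_constant`/`export_slice` are purely illustrative, not a real exporter API):

```python
# Illustrative sketch only: a dynamic-input ONNX-style operator (e.g. the
# newer Slice, which takes starts/ends as inputs rather than attributes)
# can always be emitted from static values by materializing constants.

def make_constant(name, values):
    # Stand-in for emitting an ONNX Constant node / initializer.
    return {"name": name, "op": "Constant", "value": list(values)}

def export_slice(data_name, starts, ends):
    # The exporter knows starts/ends statically, so it wires them in as
    # constant tensors feeding the Slice node's inputs.
    const_starts = make_constant(data_name + "_starts", starts)
    const_ends = make_constant(data_name + "_ends", ends)
    node = {
        "op": "Slice",
        "inputs": [data_name, const_starts["name"], const_ends["name"]],
    }
    return [const_starts, const_ends, node]

graph = export_slice("x", [0], [3])
print(graph[-1]["inputs"])  # -> ['x', 'x_starts', 'x_ends']
```

The reverse trip only works when those inputs happen to be constants the importer can fold back into attributes; a genuinely runtime-computed `starts` tensor has no static equivalent.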
   
   > Will this PR also allow dynamic shape export to ONNX as it is already 
available in Pytorch?
   
   This PR doesn't add any new functionality, and I think it should stay that way: it only bumps the ONNX version. Any new operators and functionality that the newer version enables will come in their own separate PRs.
