[GitHub] [incubator-tvm] jwfromm commented on a change in pull request #4271: [Relay][Frontend][ONNX] operator support: DepthToSpace, SpaceToDepth
jwfromm commented on a change in pull request #4271: [Relay][Frontend][ONNX] operator support: DepthToSpace, SpaceToDepth
URL: https://github.com/apache/incubator-tvm/pull/4271#discussion_r344929252

## File path: tests/python/frontend/onnx/test_forward.py

```diff
@@ -77,22 +77,27 @@ def get_tvm_output(graph_def, input_data, target, ctx, output_shape=None, output
     return tvm_output.asnumpy()
 
-def get_caffe2_output(model, x, dtype='float32'):
-    import caffe2.python.onnx.backend
-    prepared_backend = caffe2.python.onnx.backend.prepare(model)
-    W = {model.graph.input[0].name: x.astype(dtype)}
-    c2_out = prepared_backend.run(W)[0]
-    return c2_out
+def get_onnxruntime_output(model, x, dtype='float32'):
+    import onnxruntime.backend
+    rep = onnxruntime.backend.prepare(model, 'CPU')
+    x = x.astype(dtype)
+    ort_out = rep.run(x)[0]
+    return ort_out
 
 def verify_onnx_forward_impl(graph_file, data_shape, out_shape):
     dtype = 'float32'
     x = np.random.uniform(size=data_shape)
     model = onnx.load_model(graph_file)
-    c2_out = get_caffe2_output(model, x, dtype)
-    for target, ctx in ctx_list():
-        tvm_out = get_tvm_output(model, x, target, ctx, out_shape, dtype)
-        tvm.testing.assert_allclose(c2_out, tvm_out, rtol=1e-5, atol=1e-5)
+    try:
+        c2_out = get_onnxruntime_output(model, x, dtype)
+    except onnx.onnx_cpp2py_export.checker.ValidationError as e:
```

Review comment: Oh interesting. The checker must be using the OpSet-1 version of DepthToSpace, which doesn't have a mode attribute. You should try updating your version of onnxruntime to 1.0.0 (the same version TVM will use after the CI update). Alternatively, for our test we could just use the default mode (i.e. not specify it); I think that would be better than wrapping the call in a try/except. We can add testing for mode some time in the future, when we update to a newer ONNX version.

This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the URL above to go to the specific comment.
For queries about this service, please contact Infrastructure at: us...@infra.apache.org

With regards,
Apache Git Services
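The DCR-vs-CRD distinction behind this thread can be illustrated with a plain-NumPy reference following the ONNX operator spec. This is a hedged sketch for illustration; `depth_to_space_dcr` and `depth_to_space_crd` are hypothetical helper names, not part of the TVM test suite:

```python
import numpy as np

def depth_to_space_dcr(x, bs):
    # DCR ("depth-column-row") layout: the ONNX default, and the only
    # behavior in the OpSet-1 definition (which has no mode attribute).
    b, c, h, w = x.shape
    tmp = x.reshape(b, bs, bs, c // (bs * bs), h, w)
    tmp = tmp.transpose(0, 3, 4, 1, 5, 2)
    return tmp.reshape(b, c // (bs * bs), h * bs, w * bs)

def depth_to_space_crd(x, bs):
    # CRD ("column-row-depth") layout, selectable via mode="CRD" in OpSet 11.
    b, c, h, w = x.shape
    tmp = x.reshape(b, c // (bs * bs), bs, bs, h, w)
    tmp = tmp.transpose(0, 1, 4, 2, 5, 3)
    return tmp.reshape(b, c // (bs * bs), h * bs, w * bs)

# The canonical (1, 8, 2, 3) example shape with blocksize 2.
x = np.arange(48, dtype=np.float32).reshape(1, 8, 2, 3)
print(depth_to_space_dcr(x, 2).shape)  # (1, 2, 4, 6)
```

The two modes produce the same output shape but interleave channels differently, which is why a model built with `mode="CRD"` fails validation against an OpSet-1 checker.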
jwfromm commented on a change in pull request #4271: [Relay][Frontend][ONNX] operator support: DepthToSpace, SpaceToDepth
URL: https://github.com/apache/incubator-tvm/pull/4271#discussion_r344565585

## File path: tests/python/frontend/onnx/test_forward.py

```diff
@@ -77,22 +77,27 @@ def get_tvm_output(graph_def, input_data, target, ctx, output_shape=None, output
     return tvm_output.asnumpy()
 
-def get_caffe2_output(model, x, dtype='float32'):
-    import caffe2.python.onnx.backend
-    prepared_backend = caffe2.python.onnx.backend.prepare(model)
-    W = {model.graph.input[0].name: x.astype(dtype)}
-    c2_out = prepared_backend.run(W)[0]
-    return c2_out
+def get_onnxruntime_output(model, x, dtype='float32'):
+    import onnxruntime.backend
+    rep = onnxruntime.backend.prepare(model, 'CPU')
+    x = x.astype(dtype)
+    ort_out = rep.run(x)[0]
+    return ort_out
 
 def verify_onnx_forward_impl(graph_file, data_shape, out_shape):
     dtype = 'float32'
     x = np.random.uniform(size=data_shape)
    model = onnx.load_model(graph_file)
-    c2_out = get_caffe2_output(model, x, dtype)
-    for target, ctx in ctx_list():
-        tvm_out = get_tvm_output(model, x, target, ctx, out_shape, dtype)
-        tvm.testing.assert_allclose(c2_out, tvm_out, rtol=1e-5, atol=1e-5)
+    try:
+        c2_out = get_onnxruntime_output(model, x, dtype)
+    except onnx.onnx_cpp2py_export.checker.ValidationError as e:
```

Review comment: Last one, I promise, but I don't think we want try/except blocks here. If a test fails, it's important to throw an error so it can be fixed and won't sneak into the master branch. Did you add this because you were hitting ValidationErrors?
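The reviewer's point about letting failures propagate can be sketched minimally. This is an illustrative pattern, not code from the PR; `check_outputs` is a hypothetical name:

```python
import numpy as np

def check_outputs(ref_out, tvm_out, rtol=1e-5, atol=1e-5):
    # Compare reference and TVM outputs directly, with no try/except:
    # assert_allclose raises AssertionError on mismatch, so a broken
    # test fails CI loudly instead of being silently skipped.
    np.testing.assert_allclose(ref_out, tvm_out, rtol=rtol, atol=atol)

check_outputs(np.ones(4), np.ones(4) + 1e-7)  # within tolerance, no error
```

Swallowing `ValidationError` in the test harness would have the opposite effect: an unsupported model silently counts as a pass.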
[GitHub] [incubator-tvm] jwfromm commented on a change in pull request #4271: [Relay][Frontend][ONNX] operator support: DepthToSpace, SpaceToDepth
jwfromm commented on a change in pull request #4271: [Relay][Frontend][ONNX] operator support: DepthToSpace, SpaceToDepth
URL: https://github.com/apache/incubator-tvm/pull/4271#discussion_r344549188

## File path: tests/python/frontend/onnx/test_forward.py

```diff
@@ -85,6 +85,13 @@ def get_caffe2_output(model, x, dtype='float32'):
     return c2_out
```

Review comment: Let's just remove the caffe2 runtime and use onnxruntime everywhere instead. Although we could do it in a later PR, we might as well do it now.
[GitHub] [incubator-tvm] jwfromm commented on a change in pull request #4271: [Relay][Frontend][ONNX] operator support: DepthToSpace, SpaceToDepth
jwfromm commented on a change in pull request #4271: [Relay][Frontend][ONNX] operator support: DepthToSpace, SpaceToDepth
URL: https://github.com/apache/incubator-tvm/pull/4271#discussion_r343818683

## File path: python/tvm/relay/frontend/onnx.py

```diff
@@ -466,6 +466,76 @@ def _impl_v5(cls, inputs, attr, params):
                               static_shape.asnumpy().astype('int32')))
         return out
 
+
+class DepthToSpace(OnnxOpConverter):
+    """ Operator converter for DepthToSpace.
+    """
+
+    @classmethod
+    def _impl_v1(cls, inputs, attr, params):
```

Review comment: This implementation is for the OpSet-11 definition of DepthToSpace. OpSet 1 is slightly different; for example, it has no mode attribute. Please rename this to `_impl_v11` to reflect that.
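The `_impl_v1`/`_impl_v11` naming matters because converters are dispatched by opset version. A simplified sketch of that dispatch pattern, assumed from the naming convention in the quoted diff rather than copied from TVM source:

```python
class OnnxOpConverterSketch(object):
    """Hypothetical sketch: pick the newest _impl_v<N> implementation
    whose version N does not exceed the model's opset version."""

    @classmethod
    def get_converter(cls, opset):
        versions = sorted(
            int(name.replace('_impl_v', ''))
            for name in dir(cls) if name.startswith('_impl_v'))
        candidates = [v for v in versions if v <= opset]
        if not candidates:
            raise NotImplementedError(
                'opset %d not supported for %s' % (opset, cls.__name__))
        return getattr(cls, '_impl_v%d' % max(candidates))


class DepthToSpaceSketch(OnnxOpConverterSketch):
    @classmethod
    def _impl_v1(cls, inputs, attr, params):
        return 'v1: no mode attribute'

    @classmethod
    def _impl_v11(cls, inputs, attr, params):
        return 'v11: supports mode="DCR"/"CRD"'
```

Under this scheme, naming the OpSet-11 behavior `_impl_v1` would wrongly apply it to OpSet-1 models too, which is exactly the reviewer's objection.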
[GitHub] [incubator-tvm] jwfromm commented on a change in pull request #4271: [Relay][Frontend][ONNX] operator support: DepthToSpace, SpaceToDepth
jwfromm commented on a change in pull request #4271: [Relay][Frontend][ONNX] operator support: DepthToSpace, SpaceToDepth
URL: https://github.com/apache/incubator-tvm/pull/4271#discussion_r343821615

## File path: tests/python/frontend/onnx/test_forward.py

```diff
@@ -142,6 +142,123 @@ def test_reshape():
     tvm.testing.assert_allclose(ref_shape, tvm_out.shape)
 
 
+def verify_depth_to_space(indata, outdata, mode="DCR"):
+    node = onnx.helper.make_node('DepthToSpace',
+                                 inputs=['x'],
+                                 outputs=['y'],
+                                 blocksize=2,
+                                 mode=mode)
+
+    graph = helper.make_graph([node],
+                              "depth_to_space_test",
+                              inputs=[helper.make_tensor_value_info("x", TensorProto.FLOAT, list(indata.shape))],
+                              outputs=[helper.make_tensor_value_info("y", TensorProto.FLOAT, list(outdata.shape))])
+    model = helper.make_model(graph, producer_name='depth_to_space_test')
+
+    for target, ctx in ctx_list():
+        tvm_out = get_tvm_output(model, [indata], target, ctx, outdata.shape, 'float32')
+
+    tvm.testing.assert_allclose(outdata, tvm_out)
+
+
+def test_depth_to_space():
+    # CRD mode
+    # (1, 8, 2, 3) input tensor
+    x = np.array([[[[0., 1., 2.],
```

Review comment: I'm not a big fan of using hardcoded test tensors like this. Would it be possible to just generate a random NumPy tensor for the test data? Although NumPy doesn't have its own depth-to-space or space-to-depth operators, you could consider running `get_caffe2_output` on the ONNX model to produce a reference to test against.
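The reviewer's suggestion of random test data rather than hardcoded tensors could look like the sketch below. It checks a NumPy DepthToSpace reference against its SpaceToDepth inverse; `depth_to_space_ref` and `space_to_depth_ref` are hypothetical names, and a real test would compare against a backend such as onnxruntime or TVM instead:

```python
import numpy as np

def depth_to_space_ref(x, bs):
    # NumPy reference for ONNX DepthToSpace in DCR (default) mode.
    b, c, h, w = x.shape
    tmp = x.reshape(b, bs, bs, c // (bs * bs), h, w).transpose(0, 3, 4, 1, 5, 2)
    return tmp.reshape(b, c // (bs * bs), h * bs, w * bs)

def space_to_depth_ref(x, bs):
    # NumPy reference for ONNX SpaceToDepth; inverts DCR DepthToSpace.
    b, c, h, w = x.shape
    tmp = x.reshape(b, c, h // bs, bs, w // bs, bs).transpose(0, 3, 5, 1, 2, 4)
    return tmp.reshape(b, c * bs * bs, h // bs, w // bs)

# Random input instead of a hardcoded tensor, as suggested in the review.
x = np.random.uniform(size=(1, 8, 2, 3)).astype('float32')
y = depth_to_space_ref(x, 2)
assert y.shape == (1, 2, 4, 6)
# Both ops are pure reshapes/transposes, so the round trip is exact.
np.testing.assert_array_equal(space_to_depth_ref(y, 2), x)
```

Because the operators only move elements, exact equality (not a tolerance-based comparison) is appropriate for this round-trip check.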