[GitHub] [incubator-tvm] alexwong commented on issue #4497: [WIP] [Relay] Add a PyTorch to Relay Parser

2020-02-04 Thread GitBox
alexwong commented on issue #4497: [WIP] [Relay] Add a PyTorch to Relay Parser
URL: https://github.com/apache/incubator-tvm/pull/4497#issuecomment-582263501
 
 
   > CI is green now, great! Is it ready for final review? @alexwong
   
   Yes, I believe so. I think I have resolved all of the smaller changes, and the larger refactoring efforts from you will await a later PR.


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [incubator-tvm] alexwong commented on issue #4497: [WIP] [Relay] Add a PyTorch to Relay Parser

2020-01-31 Thread GitBox
alexwong commented on issue #4497: [WIP] [Relay] Add a PyTorch to Relay Parser
URL: https://github.com/apache/incubator-tvm/pull/4497#issuecomment-580873337
 
 
   > Reducing all the input sizes might help with memory issues; I don't think there's any need to use big 224x224 test data.
   
   I think that would definitely help and is worth a try, but if single-operator models are already running into issues, then larger networks definitely would as well, and all of those require larger input sizes. I'm still looking into some ways to clean up memory.
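
   For illustration only, a minimal sketch of the reduced-input idea discussed above; the `verify_small_input` helper and its shapes are hypothetical, not the helpers used in this PR, and the Relay comparison step is elided.

```python
import torch

def verify_small_input(model, input_shape=(1, 3, 8, 8)):
    """Trace a single-operator module on a tiny input instead of 224x224 data.

    The TVM side (feeding the traced module to the Relay frontend, building,
    and comparing outputs) is elided; this only sketches the small-input idea.
    """
    model = model.eval()
    inp = torch.rand(*input_shape)             # small random input to keep memory low
    with torch.no_grad():
        baseline = model(inp).numpy()          # PyTorch reference output
    traced = torch.jit.trace(model, inp).eval()
    # ... convert `traced` with the Relay frontend and compare against `baseline` ...
    return traced, baseline

verify_small_input(torch.nn.ReLU())
```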




[GitHub] [incubator-tvm] alexwong commented on issue #4497: [WIP] [Relay] Add a PyTorch to Relay Parser

2020-01-30 Thread GitBox
alexwong commented on issue #4497: [WIP] [Relay] Add a PyTorch to Relay Parser
URL: https://github.com/apache/incubator-tvm/pull/4497#issuecomment-580519028
 
 
   > Actually, I cannot run torchvision tests in this PR on my 8GB laptop. Maybe RAM is the problem?
   
   Yeah, I think it's something along those lines. I'll try cleaning up the models after every test to see if that fixes it.
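
   As a rough sketch of that cleanup idea (the per-model wrapper here is hypothetical, not the actual test harness in this PR), explicitly dropping references and forcing garbage collection between models looks like this:

```python
import gc
import torch
import torchvision

def run_one_model(model_fn):
    """Hypothetical per-model wrapper: build, test, then release memory so a
    low-RAM CI machine can reclaim the weights before the next model runs."""
    model = model_fn(pretrained=True).eval()
    inp = torch.rand(1, 3, 224, 224)
    traced = torch.jit.trace(model, inp).eval()
    # ... run the Relay conversion and accuracy check here ...
    del model, traced, inp        # drop the Python references to the weights
    gc.collect()                  # force collection between tests
    if torch.cuda.is_available():
        torch.cuda.empty_cache()  # also release cached CUDA allocations, if any

run_one_model(torchvision.models.resnet18)
```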




[GitHub] [incubator-tvm] alexwong commented on issue #4497: [WIP] [Relay] Add a PyTorch to Relay Parser

2020-01-30 Thread GitBox
alexwong commented on issue #4497: [WIP] [Relay] Add a PyTorch to Relay Parser
URL: https://github.com/apache/incubator-tvm/pull/4497#issuecomment-580517249
 
 
   My local test (using the CI container) passes, but it fails here due to an out-of-memory issue, so I think the machine running the CI simply doesn't have enough memory. I'll try some things periodically since I can't really reproduce this locally. One more thing: I'm not sure we want to keep the specific tests for other data types, as they make the code somewhat ugly and I don't see other frontends with similar tests. Perhaps we should move them to another file?
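
   One way those per-dtype checks could live in a separate file is to parameterize them; this is only a hedged sketch (pytest-based, with an illustrative Identity module), not the layout used in the PR:

```python
import pytest
import torch

@pytest.mark.parametrize("dtype", [torch.float32, torch.float64, torch.int32])
def test_trace_preserves_dtype(dtype):
    """Illustrative dtype test kept out of the main operator test file."""
    inp = torch.ones(1, 3, 4, 4, dtype=dtype)
    traced = torch.jit.trace(torch.nn.Identity(), inp).eval()
    # ... convert `traced` with the Relay frontend and check the output dtype ...
    assert traced(inp).dtype == dtype
```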




[GitHub] [incubator-tvm] alexwong commented on issue #4497: [WIP] [Relay] Add a PyTorch to Relay Parser

2020-01-30 Thread GitBox
alexwong commented on issue #4497: [WIP] [Relay] Add a PyTorch to Relay Parser
URL: https://github.com/apache/incubator-tvm/pull/4497#issuecomment-580391605
 
 
   > @alexwong the CI is not updated even if you update the docker script in this PR (see https://docs.tvm.ai/contribute/pull_request.html#ci-environment). To update for v1.4, first we need to wait for #4756 to be merged.
   > 
   > In the meantime, you can use
   > 
   > ```python
   > if torch.__version__ != "1.2.0":
   >     torch._C._jit_pass_inline(graph)
   > ```
   > 
   > to unblock your testing.
   
   Ah, makes sense. Thanks.
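
   As an aside, the same guard can be written against a parsed version rather than a string literal, which also covers dev builds such as '1.5.0a0+0dbd5c0'; this is just a sketch and the helper name is made up here:

```python
from packaging import version
import torch

def inline_if_supported(graph):
    # torch._C._jit_pass_inline does not exist in 1.2.0, so only run it on
    # newer builds; version.parse handles dev strings like "1.5.0a0+0dbd5c0".
    if version.parse(torch.__version__) > version.parse("1.2.0"):
        torch._C._jit_pass_inline(graph)
    return graph
```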




[GitHub] [incubator-tvm] alexwong commented on issue #4497: [WIP] [Relay] Add a PyTorch to Relay Parser

2020-01-29 Thread GitBox
alexwong commented on issue #4497: [WIP] [Relay] Add a PyTorch to Relay Parser
URL: https://github.com/apache/incubator-tvm/pull/4497#issuecomment-579987071
 
 
   I cleaned up all of the simpler fixes and will focus on getting the CI to pass (with the test code refactored based on @jwfromm's comment).




[GitHub] [incubator-tvm] alexwong commented on issue #4497: [WIP] [Relay] Add a PyTorch to Relay Parser

2020-01-27 Thread GitBox
alexwong commented on issue #4497: [WIP] [Relay] Add a PyTorch to Relay Parser
URL: https://github.com/apache/incubator-tvm/pull/4497#issuecomment-579028784
 
 
   > @alexwong I had big success with the refactoring: the parser itself (excluding op conversion) is about 150 lines, and the "main loop" is just:
   > 
   > ```python
   > def get_op_inputs(op_node, outputs, name_map):
   >     inputs = []
   >     for i in op_node.inputs():
   >         inode_name = name_map[i.debugName()]
   >         inputs.append(outputs[inode_name])
   >     return inputs
   > 
   > outputs = list(input_vars.values())
   > node_name_to_nid = dict(zip(input_vars.keys(), range(len(outputs))))
   > 
   > for node_name, op_node in ops.items():
   >     operator = op_node.kind()
   >     if operator == "prim::Constant":
   >         node_name_to_nid[node_name] = len(outputs)
   >         outputs.append(consts[node_name])
   >     elif operator != 'prim::ListConstruct':
   >         node_name_to_nid[node_name] = len(outputs)
   >         inputs = get_op_inputs(op_node, outputs, node_name_to_nid)
   >         call = convert_map[operator](inputs, op_in_types[node_name])
   >         outputs.append(call)
   > 
   > body = outputs[-1]
   > func = tvm.relay.Function(_analysis.free_vars(body), body)
   > param = {k: tvm.nd.array(v) for k, v in param_tensors.items()}
   > ```
   > 
   > My updated version is [here](https://gist.github.com/masahi/7704856919563c4b8a74bf085686b519).
   > 
   > Maybe this is too much change for you; I'm happy to send my change as a follow-up after this PR. We can merge this after you fix the CI issue.
   
   It is a lot of feedback, but I think I can manage; I've just been sidetracked by other things that keep pulling me away from this. I'm not sure about today, but I should be able to work on this tomorrow. I'm all for simpler code, though. Would you prefer I pull the changes above into this PR, or just make all of the simpler changes to get this merged first?
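
   For context, here is a hedged usage sketch of where a loop like the one quoted above ends up: the frontend entry point. The `from_pytorch` name and its list-of-(name, shape) signature reflect the API as it eventually landed in the TVM PyTorch frontend and may not match this exact revision of the PR.

```python
import torch
import torchvision
from tvm import relay

model = torchvision.models.resnet18(pretrained=True).eval()
inp = torch.rand(1, 3, 224, 224)
scripted = torch.jit.trace(model, inp).eval()        # TorchScript module the parser walks

shape_list = [("input0", (1, 3, 224, 224))]          # (name, shape) for each graph input
mod, params = relay.frontend.from_pytorch(scripted, shape_list)
print(mod["main"])                                   # Relay function produced by the parser
```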




[GitHub] [incubator-tvm] alexwong commented on issue #4497: [WIP] [Relay] Add a PyTorch to Relay Parser

2020-01-23 Thread GitBox
alexwong commented on issue #4497: [WIP] [Relay] Add a PyTorch to Relay Parser
URL: https://github.com/apache/incubator-tvm/pull/4497#issuecomment-577564701
 
 
   @masahi Thanks for the help on this! I will address the comments and resume 
work on this tomorrow. 




[GitHub] [incubator-tvm] alexwong commented on issue #4497: [WIP] [Relay] Add a PyTorch to Relay Parser

2020-01-14 Thread GitBox
alexwong commented on issue #4497: [WIP] [Relay] Add a PyTorch to Relay Parser
URL: https://github.com/apache/incubator-tvm/pull/4497#issuecomment-574383474
 
 
   > @alexwong I tried your PR locally. With pytorch v1.3 it works, but they introduced a big change in [#28408](https://github.com/pytorch/pytorch/pull/28408) and [#28409](https://github.com/pytorch/pytorch/pull/28409), and it broke your PR (I may be wrong about which PR broke it). Below is what their IR looks like for resnet18 now. My pytorch version is '1.5.0a0+0dbd5c0' (the output of `torch.__version__`).
   > 
   > ```
   > graph(%self.1 : __torch__.torch.nn.modules.module.___torch_mangle_66.Module,
   >   %input.1 : Float(1, 3, 224, 224)):
   >   %1452 : __torch__.torch.nn.modules.module.___torch_mangle_65.Module = prim::GetAttr[name="fc"](%self.1)
   >   %1449 : __torch__.torch.nn.modules.module.___torch_mangle_64.Module = prim::GetAttr[name="avgpool"](%self.1)
   >   %1448 : __torch__.torch.nn.modules.module.___torch_mangle_63.Module = prim::GetAttr[name="layer4"](%self.1)
   >   %1402 : __torch__.torch.nn.modules.module.___torch_mangle_47.Module = prim::GetAttr[name="layer3"](%self.1)
   >   %1356 : __torch__.torch.nn.modules.module.___torch_mangle_31.Module = prim::GetAttr[name="layer2"](%self.1)
   >   %1310 : __torch__.torch.nn.modules.module.___torch_mangle_15.Module = prim::GetAttr[name="layer1"](%self.1)
   >   %1273 : __torch__.torch.nn.modules.module.___torch_mangle_2.Module = prim::GetAttr[name="maxpool"](%self.1)
   >   %1272 : __torch__.torch.nn.modules.module.___torch_mangle_1.Module = prim::GetAttr[name="relu"](%self.1)
   >   %1271 : __torch__.torch.nn.modules.module.___torch_mangle_0.Module = prim::GetAttr[name="bn1"](%self.1)
   >   %1265 : __torch__.torch.nn.modules.module.Module = prim::GetAttr[name="conv1"](%self.1)
   >   %1528 : Tensor = prim::CallMethod[name="forward"](%1265, %input.1)
   >   %1529 : Tensor = prim::CallMethod[name="forward"](%1271, %1528)
   >   %1530 : Tensor = prim::CallMethod[name="forward"](%1272, %1529)
   >   %1531 : Tensor = prim::CallMethod[name="forward"](%1273, %1530)
   >   %1532 : Tensor = prim::CallMethod[name="forward"](%1310, %1531)
   >   %1533 : Tensor = prim::CallMethod[name="forward"](%1356, %1532)
   >   %1534 : Tensor = prim::CallMethod[name="forward"](%1402, %1533)
   >   %1535 : Tensor = prim::CallMethod[name="forward"](%1448, %1534)
   >   %1536 : Tensor = prim::CallMethod[name="forward"](%1449, %1535)
   >   %1182 : int = prim::Constant[value=1]() # /home/masa/anaconda3/lib/python3.7/site-packages/torchvision-0.5.0a0+07cbb46-py3.7-linux-x86_64.egg/torchvision/models/resnet.py:210:0
   >   %1183 : int = prim::Constant[value=-1]() # /home/masa/anaconda3/lib/python3.7/site-packages/torchvision-0.5.0a0+07cbb46-py3.7-linux-x86_64.egg/torchvision/models/resnet.py:210:0
   >   %input : Float(1, 512) = aten::flatten(%1536, %1182, %1183) # /home/masa/anaconda3/lib/python3.7/site-packages/torchvision-0.5.0a0+07cbb46-py3.7-linux-x86_64.egg/torchvision/models/resnet.py:210:0
   >   %1537 : Tensor = prim::CallMethod[name="forward"](%1452, %input)
   >   return (%1537)
   > ```
   
   I can take a look again in the next few days. I will probably move to at least support PyTorch 1.4, which was just released (or will be in the next few days) and may have those IR changes as well. The remaining to-dos are making sure different types work, cleaning up the tests based on @jwfromm's comments, and updating the parser to work with PyTorch 1.4 and later.
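
   For readers following the IR discussion above: the prim::GetAttr / prim::CallMethod structure is what newer TorchScript serialization emits, and the per-op conversion only sees plain aten:: nodes after the graph is inlined. A hedged sketch of inspecting that difference, assuming a PyTorch build that has the inline pass:

```python
import torch
import torchvision

model = torchvision.models.resnet18(pretrained=True).eval()
traced = torch.jit.trace(model, torch.rand(1, 3, 224, 224))

graph = traced.graph
print(graph)                       # newer builds: prim::GetAttr / prim::CallMethod calls

torch._C._jit_pass_inline(graph)   # flatten the submodule calls into a single graph
print(graph)                       # now the aten::_convolution / aten::relu ops are visible
```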




[GitHub] [incubator-tvm] alexwong commented on issue #4497: [WIP] [Relay] Add a PyTorch to Relay Parser

2020-01-08 Thread GitBox
alexwong commented on issue #4497: [WIP] [Relay] Add a PyTorch to Relay Parser
URL: https://github.com/apache/incubator-tvm/pull/4497#issuecomment-572213105
 
 
   > @alexwong it seems CI is stuck after failing resnet test?
   
   Yes, and some operator unit tests failed (batch_norm and dense). It's slightly hard to debug as I can't seem to reproduce it locally at the moment. I'll see what is happening.




[GitHub] [incubator-tvm] alexwong commented on issue #4497: [WIP] [Relay] Add a PyTorch to Relay Parser

2020-01-06 Thread GitBox
alexwong commented on issue #4497: [WIP] [Relay] Add a PyTorch to Relay Parser
URL: https://github.com/apache/incubator-tvm/pull/4497#issuecomment-571368264
 
 
   Thanks for the review, @jwfromm, and sorry for the delay! I was busy with some other things, and then the holidays happened!

