alexwong commented on issue #4497: [WIP] [Relay] Add a PyTorch to Relay Parser

URL: https://github.com/apache/incubator-tvm/pull/4497#issuecomment-574383474

> @alexwong I tried your PR locally. With PyTorch v1.3 it works, but they introduced a big change in [#28408](https://github.com/pytorch/pytorch/pull/28408) and [#28409](https://github.com/pytorch/pytorch/pull/28409) that broke your PR (I may be wrong about which PR broke it). Below is how their IR looks for resnet18 now. My PyTorch version is '1.5.0a0+0dbd5c0' (the output of `torch.__version__`).
>
> ```
> graph(%self.1 : __torch__.torch.nn.modules.module.___torch_mangle_66.Module,
>       %input.1 : Float(1, 3, 224, 224)):
>   %1452 : __torch__.torch.nn.modules.module.___torch_mangle_65.Module = prim::GetAttr[name="fc"](%self.1)
>   %1449 : __torch__.torch.nn.modules.module.___torch_mangle_64.Module = prim::GetAttr[name="avgpool"](%self.1)
>   %1448 : __torch__.torch.nn.modules.module.___torch_mangle_63.Module = prim::GetAttr[name="layer4"](%self.1)
>   %1402 : __torch__.torch.nn.modules.module.___torch_mangle_47.Module = prim::GetAttr[name="layer3"](%self.1)
>   %1356 : __torch__.torch.nn.modules.module.___torch_mangle_31.Module = prim::GetAttr[name="layer2"](%self.1)
>   %1310 : __torch__.torch.nn.modules.module.___torch_mangle_15.Module = prim::GetAttr[name="layer1"](%self.1)
>   %1273 : __torch__.torch.nn.modules.module.___torch_mangle_2.Module = prim::GetAttr[name="maxpool"](%self.1)
>   %1272 : __torch__.torch.nn.modules.module.___torch_mangle_1.Module = prim::GetAttr[name="relu"](%self.1)
>   %1271 : __torch__.torch.nn.modules.module.___torch_mangle_0.Module = prim::GetAttr[name="bn1"](%self.1)
>   %1265 : __torch__.torch.nn.modules.module.Module = prim::GetAttr[name="conv1"](%self.1)
>   %1528 : Tensor = prim::CallMethod[name="forward"](%1265, %input.1)
>   %1529 : Tensor = prim::CallMethod[name="forward"](%1271, %1528)
>   %1530 : Tensor = prim::CallMethod[name="forward"](%1272, %1529)
>   %1531 : Tensor = prim::CallMethod[name="forward"](%1273, %1530)
>   %1532 : Tensor = prim::CallMethod[name="forward"](%1310, %1531)
>   %1533 : Tensor = prim::CallMethod[name="forward"](%1356, %1532)
>   %1534 : Tensor = prim::CallMethod[name="forward"](%1402, %1533)
>   %1535 : Tensor = prim::CallMethod[name="forward"](%1448, %1534)
>   %1536 : Tensor = prim::CallMethod[name="forward"](%1449, %1535)
>   %1182 : int = prim::Constant[value=1]() # /home/masa/anaconda3/lib/python3.7/site-packages/torchvision-0.5.0a0+07cbb46-py3.7-linux-x86_64.egg/torchvision/models/resnet.py:210:0
>   %1183 : int = prim::Constant[value=-1]() # /home/masa/anaconda3/lib/python3.7/site-packages/torchvision-0.5.0a0+07cbb46-py3.7-linux-x86_64.egg/torchvision/models/resnet.py:210:0
>   %input : Float(1, 512) = aten::flatten(%1536, %1182, %1183) # /home/masa/anaconda3/lib/python3.7/site-packages/torchvision-0.5.0a0+07cbb46-py3.7-linux-x86_64.egg/torchvision/models/resnet.py:210:0
>   %1537 : Tensor = prim::CallMethod[name="forward"](%1452, %input)
>   return (%1537)
> ```

I can take a look again in the next few days. I will probably move to supporting at least PyTorch 1.4, which was just released (or will be in the next few days) and may have those IR changes as well. The remaining to-dos are making sure different types work, cleaning up the tests based on @jwfromm's comments, and updating the parser to work with 1.4+.
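The breaking change in the IR quoted above is that submodule invocations now appear as `prim::CallMethod` nodes on module values fetched via `prim::GetAttr`, rather than as one flat graph of `aten::` ops. As a rough, hypothetical illustration (not code from this PR), a converter could first scan a dumped graph for such nodes to decide whether it needs inlining before op-by-op conversion; the regex and helper below are assumptions for the sketch, operating only on the textual IR dump:

```python
import re

# Matches lines of the form:
#   %1528 : Tensor = prim::CallMethod[name="forward"](%1265, %input.1)
# capturing the output value, the method name, and the argument list.
CALL_METHOD_RE = re.compile(
    r'%(?P<out>\S+)\s*:\s*Tensor\s*=\s*'
    r'prim::CallMethod\[name="(?P<method>\w+)"\]\((?P<args>[^)]*)\)'
)

def find_call_methods(ir_text):
    """Return (output, method, args) tuples for every prim::CallMethod node
    found in a textual TorchScript graph dump."""
    calls = []
    for m in CALL_METHOD_RE.finditer(ir_text):
        args = [a.strip() for a in m.group("args").split(",")]
        calls.append((m.group("out"), m.group("method"), args))
    return calls

# Two lines taken from the resnet18 dump above.
ir_snippet = '''
  %1528 : Tensor = prim::CallMethod[name="forward"](%1265, %input.1)
  %1529 : Tensor = prim::CallMethod[name="forward"](%1271, %1528)
'''

calls = find_call_methods(ir_snippet)
for out, method, args in calls:
    # First argument is the module value, the rest are its tensor inputs.
    print(out, method, args)
```

A non-empty result would signal that the graph still contains un-inlined module calls, which is exactly the shape of IR the older parser did not expect.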
----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the URL above to go to the specific comment.
For queries about this service, please contact Infrastructure at: [email protected]

With regards,
Apache Git Services
