ganler edited a comment on issue #10651:
URL: https://github.com/apache/tvm/issues/10651#issuecomment-1070442633


   > What happens if you use the PT frontend?
   
   It seems that does not work either (though I am not very familiar with the
`from_pytorch` API, so I may have made a mistake in the following code).
   
   ```python
   import torch
   
   class Net(torch.nn.Module):
       def __init__(self) -> None:
           super().__init__()
           self.weight = torch.nn.Parameter(torch.randn(3, 1))
   
       def forward(self, x):
           return torch.matmul(x, self.weight)
   
   
   net = Net().eval()
   
   import tvm
   from tvm import relay
   from tvm.relay.frontend import from_pytorch
   
   example_input = torch.randn(3)  # example input matching the ("x", (3,)) shape below
   scripted_model = torch.jit.trace(net, example_input).eval()
   mod = from_pytorch(scripted_model, [("x", (3,))])
   
   with tvm.transform.PassContext(opt_level=4):
       relay.build_module.create_executor("graph", mod, tvm.cpu(), 
target='llvm').evaluate()
   ```
   
   Log:
   
   ```
     File "test.py", line 87, in <module>
       relay.build_module.create_executor("graph", mod, tvm.cpu(), 
target='llvm').evaluate()
     File "/home/ganler/Documents/tvm/python/tvm/relay/backend/interpreter.py", 
line 171, in evaluate
       return self._make_executor()
     File "/home/ganler/Documents/tvm/python/tvm/relay/build_module.py", line 
618, in _make_executor
       self.mod = InferType()(self.mod)
     File "/home/ganler/Documents/tvm/python/tvm/ir/transform.py", line 161, in 
__call__
       return _ffi_transform_api.RunPass(self, mod)
     File "/home/ganler/Documents/tvm/python/tvm/_ffi/_ctypes/packed_func.py", 
line 237, in __call__
       raise get_last_ffi_error()
   tvm._ffi.base.TVMError: Traceback (most recent call last):
     2: TVMFuncCall
     1: 
tvm::runtime::PackedFuncObj::Extractor<tvm::runtime::PackedFuncSubObj<tvm::runtime::TypedPackedFunc<tvm::IRModule
 (tvm::transform::Pass, 
tvm::IRModule)>::AssignTypedLambda<tvm::transform::$_6>(tvm::transform::$_6, 
std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> 
>)::{lambda(tvm::runtime::TVMArgs const&, tvm::runtime::TVMRetValue*)#1}> 
>::Call(tvm::runtime::PackedFuncObj const*, tvm::runtime::TVMArgs, 
tvm::runtime::TVMRetValue*)
     0: tvm::runtime::TVMMovableArgValueWithContext_::operator 
tvm::IRModule<tvm::IRModule>() const
     4: TVMFuncCall
     3: 
tvm::runtime::PackedFuncObj::Extractor<tvm::runtime::PackedFuncSubObj<tvm::runtime::TypedPackedFunc<tvm::IRModule
 (tvm::transform::Pass, 
tvm::IRModule)>::AssignTypedLambda<tvm::transform::$_6>(tvm::transform::$_6, 
std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> 
>)::{lambda(tvm::runtime::TVMArgs const&, tvm::runtime::TVMRetValue*)#1}> 
>::Call(tvm::runtime::PackedFuncObj const*, tvm::runtime::TVMArgs, 
tvm::runtime::TVMRetValue*)
     2: tvm::runtime::TVMMovableArgValueWithContext_::operator 
tvm::IRModule<tvm::IRModule>() const
     1: tvm::runtime::TVMMovableArgValue_::operator 
tvm::IRModule<tvm::IRModule, void>() const
     0: tvm::IRModule tvm::runtime::TVMPODValue_::AsObjectRef<tvm::IRModule>() 
const
     File "/home/ganler/Documents/tvm/include/tvm/runtime/packed_func.h", line 
777
   TVMError: In function transform.RunPass(0: transform.Pass, 1: IRModule) -> 
IRModule: error while converting argument 1: [02:31:28] 
/home/ganler/Documents/tvm/include/tvm/runtime/packed_func.h:1863: 
   ---------------------------------------------------------------
   An error occurred during the execution of TVM.
   For more information, please see: https://tvm.apache.org/docs/errors.html
   ---------------------------------------------------------------
     Check failed: (!checked_type.defined()) is false: Expected IRModule, but 
got Array
   ```

