nolanliou opened a new issue #6268:
URL: https://github.com/apache/incubator-tvm/issues/6268


   When I convert a PyTorch BERT model to TVM, the conversion fails with the error below:
   ```
     File "tvm/python/tvm/relay/frontend/pytorch.py", line 2673, in from_pytorch
       default_dtype=default_dtype)
     File "/tvm/python/tvm/relay/frontend/pytorch.py", line 2580, in convert_operators
       relay_out = relay_op(inputs, _get_input_types(op_node, default_dtype=default_dtype))
     File "tvm/python/tvm/relay/frontend/pytorch.py", line 255, in _impl
       inferred_shape = _infer_shape(data)
     File "tvm/python/tvm/relay/frontend/common.py", line 486, in infer_shape
       out_type = infer_type(inputs, mod=mod)
     File "tvm/python/tvm/relay/frontend/common.py", line 465, in infer_type
       new_mod = IRModule.from_expr(node)
     File "tvm/python/tvm/ir/module.py", line 222, in from_expr
       return _ffi_api.Module_FromExpr(expr, funcs, defs)
     File "tvm/python/tvm/_ffi/_ctypes/packed_func.py", line 225, in __call__
       raise get_last_ffi_error()
   tvm._ffi.base.TVMError: Traceback (most recent call last):
     [bt] (8) tvm/build/libtvm.so(tvm::relay::ExprVisitor::VisitExpr_(tvm::relay::CallNode const*)+0x13c) [0x7f21a6454bdc]
     [bt] (7) tvm/build/libtvm.so(tvm::relay::ExprVisitor::VisitExpr(tvm::RelayExpr const&)+0x71) [0x7f21a6457711]
     [bt] (6) tvm/build/libtvm.so(tvm::relay::ExprFunctor<void (tvm::RelayExpr const&)>::VisitExpr(tvm::RelayExpr const&)+0x5b) [0x7f21a64153eb]
     [bt] (5) tvm/build/libtvm.so(tvm::relay::ExprVisitor::VisitExpr_(tvm::relay::CallNode const*)+0x13c) [0x7f21a6454bdc]
     [bt] (4) tvm/build/libtvm.so(tvm::relay::ExprVisitor::VisitExpr(tvm::RelayExpr const&)+0x71) [0x7f21a6457711]
     [bt] (3) tvm/build/libtvm.so(tvm::relay::ExprFunctor<void (tvm::RelayExpr const&)>::VisitExpr(tvm::RelayExpr const&)+0x5b) [0x7f21a64153eb]
     [bt] (2) tvm/build/libtvm.so(tvm::relay::TypeVarEVisitor::VisitExpr_(tvm::ConstructorNode const*)+0x35) [0x7f21a6266505]
     [bt] (1) tvm/build/libtvm.so(tvm::IRModuleNode::LookupTypeDef(tvm::GlobalTypeVar const&) const+0x14a) [0x7f21a5c34a4a]
     [bt] (0) tvm/build/libtvm.so(+0xdc19e7) [0x7f21a5c329e7]
     File "tvm/src/ir/module.cc", line 299
   TVMError: Check failed: it != type_definitions.end(): There is no definition of static_tensor_float32_1_4_768_t
   ```
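   The failing check is a lookup in the module's ADT registry: the frontend asked for a tensor type definition that was never registered. As a rough illustration only (this helper is made up for this post, it is not the TVM API), the missing identifier appears to encode the dtype and every static dimension of the tensor:

   ```python
   def static_tensor_type_name(dtype, shape):
       # Hypothetical reconstruction of the naming scheme: the dtype and each
       # dimension of the static shape are baked into the type identifier.
       return "static_tensor_{}_{}_t".format(dtype, "_".join(str(d) for d in shape))

   print(static_tensor_type_name("float32", (1, 4, 768)))
   # static_tensor_float32_1_4_768_t  -- the exact name in the error above
   ```

   So the error is complaining about a `float32` tensor of shape `(1, 4, 768)` whose type was never defined before use.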
   
   Part of the TorchScript graph is listed below:
   ```
     _234 = torch.transpose(torch.matmul(scores22, v10), 1, 2)
     h10 = torch.contiguous(_234, memory_format=0)
     _235 = ops.prim.NumToTensor(torch.size(h10, 0))
     _236 = ops.prim.NumToTensor(torch.size(h10, 1))
     input114 = torch.view(h10, [int(_235), int(_236), 768])
     output68 = torch.matmul(input114, torch.t(CONSTANTS.c143))
     input115 = torch.add_(output68, CONSTANTS.c150, alpha=1)
     _237 = torch.dropout(input115, 0.10000000000000001, False)
     input116 = torch.add(input111, _237, alpha=1)
     input117 = torch.layer_norm(input116, [768], CONSTANTS.c3, CONSTANTS.c4, 9.9999999999999998e-13, True)
     output69 = torch.matmul(input117, torch.t(CONSTANTS.c151))
     x94 = torch.add_(output69, CONSTANTS.c152, alpha=1)
     _238 = torch.mul(torch.pow(x94, 3), CONSTANTS.c17)
     _239 = torch.mul(torch.add(x94, _238, alpha=1), CONSTANTS.c18)
     _240 = torch.mul(x94, CONSTANTS.c19)
     _241 = torch.add(torch.tanh(_239), CONSTANTS.c20, alpha=1)
     input118 = torch.mul(_240, _241)
     output70 = torch.matmul(input118, torch.t(CONSTANTS.c153))
     input119 = torch.add_(output70, CONSTANTS.c154, alpha=1)
     _242 = torch.dropout(input119, 0.10000000000000001, False)
     input120 = torch.add(input117, _242, alpha=1)
     encoder_out = torch.layer_norm(input120, [768], CONSTANTS.c3, CONSTANTS.c4, 9.9999999999999998e-13, True)
     _243 = torch.slice(encoder_out, 0, 0, 9223372036854775807, 1)
     input121 = torch.select(_243, 1, 0)
     input122 = torch.addmm(CONSTANTS.c156, input121, torch.t(CONSTANTS.c155), beta=1, alpha=1)
     input123 = torch.tanh(input122)
     input124 = torch.dropout(input123, 0.10000000000000001, False)
     input125 = torch.addmm(CONSTANTS.c158, input124, torch.t(CONSTANTS.c157), beta=1, alpha=1)
     r = torch.softmax(input125, 1, None)
     return torch.to(r, 6, False, False)
   ```
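   Note the `torch.slice(encoder_out, 0, 0, 9223372036854775807, 1)` line near the end: TorchScript emits `INT64_MAX` as the end index to mean "slice to the end of the dimension". A minimal plain-Python sketch of the clamping a frontend has to perform (the helper name is invented for illustration, it is not TVM code):

   ```python
   INT64_MAX = 9223372036854775807  # sentinel TorchScript emits for an open-ended slice

   def clamp_slice_end(end, dim_size):
       # A converter must clamp the sentinel to the real dimension extent
       # before it can build a strided slice with concrete bounds.
       return min(end, dim_size)

   # encoder_out has shape [batch, seq_len, 768]; the slice keeps all of dim 0,
   # so for a batch of 4 the effective bounds are [0:4].
   print(clamp_slice_end(INT64_MAX, 4))  # 4
   ```

   If the input shape is dynamic at conversion time, the dimension extent is not a plain integer, which may be where the type-inference call in the traceback comes in.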
   
   It fails when converting the `slice` op.
   
   Has anybody encountered this issue before?
   


----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
[email protected]