SubjectNoi opened a new issue #7650:
URL: https://github.com/apache/tvm/issues/7650


   TVM Version: 0.8
   OS: Ubuntu 18.04
   
   I encounter a problem when I use `relay.build` to compile an IR module **after I set an attribute on the module's `main` function** via `mod['main'] = mod['main'].with_attr("Attr_Key", "Attr_Value")` (I do this because I want to use my own codegen, via `mod['main'] = mod['main'].with_attr("compiler", "my_code_gen")`). The ONNX model I use is bertsquad-8.onnx with dynamic shapes, downloaded at:
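   For reference, my understanding is that the `Compiler` attribute is normally produced by the BYOC partitioning passes on extracted subfunctions rather than set on `main` directly; the sketch below is only illustrative, and `my_code_gen` is a placeholder for a codegen that has actually been registered with TVM:
   
   ```
   import tvm
   from tvm import relay
   
   def partition_for_my_codegen(mod):
       # Tag operators supported by the external codegen, merge adjacent
       # annotated regions, and split them into separate functions carrying
       # the "Compiler" attribute, instead of annotating "main" itself.
       seq = tvm.transform.Sequential([
           relay.transform.AnnotateTarget("my_code_gen"),
           relay.transform.MergeCompilerRegions(),
           relay.transform.PartitionGraph(),
       ])
       return seq(mod)
   ```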
   
   The following script reproduces my problem:
   
   ```
    1: from tvm import relay
    2: import tvm
    3: from tvm.contrib.debugger import debug_runtime as graph_runtime
    4: import onnx
    5:
    6: model = onnx.load("bertsquad-8.onnx")
    7: target = 'c'
    8: shape_dict = { 
    9:     "unique_ids_raw_output___9:0" : (1,),
   10:     "input_ids:0": (1, 256),
   11:     "input_mask:0": (1, 256),
   12:     "segment_ids:0": (1, 256)
   13: }
   14: mod, params = relay.frontend.from_onnx(model, shape_dict)
   15: mod['main'] = mod['main'].with_attr("Compiler", "My_Code_Gen")  # Without this line, the program runs normally
   16: mod = relay.transform.InferType()(mod)
   17: json, lib, _ = relay.build(mod, target=target, params=params)
   ```
   
   Once I add Line 15, I encounter the following error:
   ```
   Traceback (most recent call last):
     File "run.py", line 59, in <module>
       json, lib, _ = relay.build(mod, target=target, params=params)
     File "/data/tvm/python/tvm/relay/build_module.py", line 276, in build
       graph_json, mod, params = bld_mod.build(mod, target, target_host, params)
     File "/data/tvm/python/tvm/relay/build_module.py", line 139, in build
       self._build(mod, target, target_host)
     File "/data/tvm/python/tvm/_ffi/_ctypes/packed_func.py", line 237, in __call__
       raise get_last_ffi_error()
   tvm._ffi.base.TVMError: Traceback (most recent call last):
     [bt] (8) /data/tvm/build/libtvm.so(tvm::relay::ExprFunctor<void (tvm::RelayExpr const&)>::VisitExpr(tvm::RelayExpr const&)+0x6b) [0x7fafedc18d9b]
     [bt] (7) /data/tvm/build/libtvm.so(tvm::relay::StorageAllocator::VisitExpr_(tvm::relay::CallNode const*)+0xbe) [0x7fafedbc2c9e]
     [bt] (6) /data/tvm/build/libtvm.so(tvm::relay::ExprVisitor::VisitExpr(tvm::RelayExpr const&)+0x7b) [0x7fafedc6b03b]
     [bt] (5) /data/tvm/build/libtvm.so(tvm::relay::ExprFunctor<void (tvm::RelayExpr const&)>::VisitExpr(tvm::RelayExpr const&)+0x6b) [0x7fafedc18d9b]
     [bt] (4) /data/tvm/build/libtvm.so(tvm::relay::StorageAllocator::VisitExpr_(tvm::relay::CallNode const*)+0x1b5) [0x7fafedbc2d95]
     [bt] (3) /data/tvm/build/libtvm.so(tvm::relay::StorageAllocator::CreateToken(tvm::RelayExprNode const*, bool)+0x17d) [0x7fafedbc28cd]
     [bt] (2) /data/tvm/build/libtvm.so(tvm::relay::StorageAllocator::Request(tvm::relay::StorageToken*)+0x30) [0x7fafedbc1ae0]
     [bt] (1) /data/tvm/build/libtvm.so(tvm::relay::StorageAllocator::GetMemorySize(tvm::relay::StorageToken*)+0x296) [0x7fafedbc15f6]
     [bt] (0) /data/tvm/build/libtvm.so(+0x2f67928) [0x7fafedbbf928]
     File "/data/tvm/src/relay/backend/graph_plan_memory.cc", line 292
   TVMError: 
   ---------------------------------------------------------------
   An internal invariant was violated during the execution of TVM.
   Please read TVM's error reporting guidelines.
   More details can be found here: https://discuss.tvm.ai/t/error-reporting/7793.
   ---------------------------------------------------------------
     Check failed: pval != nullptr == false: Cannot allocate memory symbolic tensor shape [?]
   ```
   **But once I delete Line 15, the program runs normally. I wonder how to resolve this.**

