SiriusNEO commented on PR #14606:
URL: https://github.com/apache/tvm/pull/14606#issuecomment-1506289428

   @yongwww Sure! For the example above, running it produces an error at this line:
   ```
   ex = relax.build(lowered_mod, "llvm")
   ```
   And the error message is:
   ```
           at /home/sirius/catalyst/mlc-ai/src/relax/backend/vm/vm_shape_lower.cc:366
     File "/home/sirius/catalyst/mlc-ai/src/relax/backend/vm/vm_shape_lower.cc", line 366
   TVMError:
   ---------------------------------------------------------------
   An error occurred during the execution of TVM.
   For more information, please see: https://tvm.apache.org/docs/errors.html
   ---------------------------------------------------------------
     Check failed: (slot->value_computed) is false: PrimExpr T.int64(4) * batch_size * T.int64(10) has not been computed
   ```
   But if you remove either `main` or `main1` from the module `Test`, the build succeeds.
   In another experiment, I cleared `slot_vec_` and `slot_map_` at the beginning of `Function Rewrite(GlobalVar gvar, Function func)` in the VM shape lowering pass, and that also makes this example work.
   So after digging into it a bit, I found that the problem lies in the duplicated TIR var shared by multiple functions (I can't work out every detail, since I'm not very familiar with the logic of `VMShapeLower`). After communicating with TQ, this behaviour (using the same TIR var in multiple functions) is considered illegal, so I sent this PR.
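   To illustrate why clearing that state per function helps, here is a toy Python model. This is NOT real TVM code: the class names, the "skip re-registering an expression already in the map" behaviour, and the per-function frame reset are my assumptions about how `VMShapeLower`'s `slot_map_` bookkeeping could produce this symptom, made only to show the shape of the bug.

   ```python
   # Toy model of VMShapeLower's slot bookkeeping (NOT real TVM code).
   # Assumption: a shape PrimExpr already present in slot_map_ is not
   # registered again, so its compute step is never emitted for the
   # second function, and the value_computed check then fails.

   class ShapeSlot:
       def __init__(self, expr):
           self.expr = expr
           self.value_computed = False

   class ToyShapeLower:
       def __init__(self, clear_per_function):
           self.clear_per_function = clear_per_function
           self.slot_map = {}  # models slot_map_: shape expr -> ShapeSlot

       def lower_function(self, shape_exprs):
           """Return error messages produced while lowering one function."""
           if self.clear_per_function:
               self.slot_map.clear()  # the fix tried in the comment
           # Each function has its own VM frame: nothing is computed yet.
           for slot in self.slot_map.values():
               slot.value_computed = False
           # Register slots, skipping exprs already seen (possibly stale).
           newly_registered = []
           for e in shape_exprs:
               if e not in self.slot_map:
                   self.slot_map[e] = ShapeSlot(e)
                   newly_registered.append(e)
           # Emit the "compute shape value" step only for new slots.
           for e in newly_registered:
               self.slot_map[e].value_computed = True
           # Final check, mirroring `Check failed: (slot->value_computed)`.
           return [f"PrimExpr {e} has not been computed"
                   for e in shape_exprs
                   if not self.slot_map[e].value_computed]

   expr = "T.int64(4) * batch_size * T.int64(10)"

   buggy = ToyShapeLower(clear_per_function=False)
   print(buggy.lower_function([expr]))  # first function: no errors
   print(buggy.lower_function([expr]))  # second function hits the stale slot

   fixed = ToyShapeLower(clear_per_function=True)
   print(fixed.lower_function([expr]))  # both functions lower cleanly
   print(fixed.lower_function([expr]))
   ```

   In this toy, two functions sharing the same shape expression reproduce the failure exactly when the map is not cleared, which matches both observations above (removing one function, or clearing the state, makes the error disappear).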

