electriclilies commented on a change in pull request #8886:
URL: https://github.com/apache/tvm/pull/8886#discussion_r701389234



##########
File path: src/relay/backend/graph_executor_codegen.cc
##########
@@ -236,9 +236,13 @@ class GraphExecutorCodegen : public backend::MemoizedExprTranslator<std::vector<
           tec::UpdateFunctionMetadata(func, this->function_metadata_);
         })(mod);
 
-    tec::LoweredModule lowered_module = tec::IRModuleToLoweredModule(new_mod);
-    function_metadata_.Set(runtime::symbol::tvm_module_main, lowered_module.main_func_info);
-    auto main_module = lowered_module.main_module;
+    Optional<backend::FunctionInfo> main_func_info =
+        lowered_mod->GetAttr<backend::FunctionInfo>("main_func_info");
+    ICHECK(main_func_info) << "The attribute \"main_func_info\" should be set at this point.";
+    function_metadata_.Set(runtime::symbol::tvm_module_main, main_func_info.value());
+
+    // Get only the Relay functions out of the lowered module so we can run type inference on them
+    IRModule main_module = tec::GetMainModule(lowered_mod);

Review comment:
       Actually there's a slight problem with this: you can't run InferType on the whole `lowered_mod`, because `lowered_mod` contains functions whose GlobalVars are present in the IRModule's global_var_map, yet Lookup can't find them because they are not the same object (it fails on line 210).
   
   https://github.com/apache/tvm/blob/main/src/relay/transforms/type_infer.cc#L209:L215
   My workaround was to apply type inference only to the `main_module` (since that is what was done before this PR).
   
   I'm not sure whether this is a bug in how the type inferencer handles GlobalVars (maybe it should look them up by name hint rather than pointer equality) or a bug in how those GlobalVars are being created / propagated.
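   The pointer-vs-name lookup distinction above can be sketched in isolation. This is a simplified, hypothetical stand-in (`GlobalVar`, `FoundByPointer`, `FoundByName` are illustrative names, not TVM's actual IRModule API), just to show why two nodes carrying the same name hint miss each other under identity-based lookup:

```cpp
#include <memory>
#include <string>
#include <unordered_map>

// Hypothetical stand-in for a GlobalVar node: its identity is the node
// pointer, not the name hint it carries.
struct GlobalVar {
  std::string name_hint;
};
using GlobalVarRef = std::shared_ptr<GlobalVar>;

// Two distinct nodes with the same name hint, analogous to the GlobalVar
// stored in the module's map vs. the one lowering hands back.
static const GlobalVarRef kOriginal =
    std::make_shared<GlobalVar>(GlobalVar{"main"});
static const GlobalVarRef kDuplicate =
    std::make_shared<GlobalVar>(GlobalVar{"main"});

// Lookup keyed on pointer identity (how the failure manifests): the
// duplicate node is not found even though a "main" entry exists.
bool FoundByPointer() {
  std::unordered_map<GlobalVarRef, int> funcs{{kOriginal, 0}};
  return funcs.find(kDuplicate) != funcs.end();
}

// Lookup keyed on the name hint (the possible fix mentioned above): the
// duplicate node resolves to the same entry.
bool FoundByName() {
  std::unordered_map<std::string, int> funcs{{kOriginal->name_hint, 0}};
  return funcs.count(kDuplicate->name_hint) == 1;
}
```

   With this sketch, `FoundByPointer()` is false while `FoundByName()` is true, which mirrors why Lookup fails on `lowered_mod` even though the names match.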

-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
