electriclilies commented on a change in pull request #8886:
URL: https://github.com/apache/tvm/pull/8886#discussion_r701391105



##########
File path: src/relay/backend/graph_executor_codegen.cc
##########
@@ -236,9 +236,13 @@ class GraphExecutorCodegen : public backend::MemoizedExprTranslator<std::vector<
           tec::UpdateFunctionMetadata(func, this->function_metadata_);
         })(mod);
 
-    tec::LoweredModule lowered_module = tec::IRModuleToLoweredModule(new_mod);
-    function_metadata_.Set(runtime::symbol::tvm_module_main, lowered_module.main_func_info);
-    auto main_module = lowered_module.main_module;
+    Optional<backend::FunctionInfo> main_func_info =
+        lowered_mod->GetAttr<backend::FunctionInfo>("main_func_info");
+    ICHECK(main_func_info) << "The attribute \"main_func_info\" should be set at this point.";
+    function_metadata_.Set(runtime::symbol::tvm_module_main, main_func_info.value());
+
+    // Get only the Relay functions out of the lowered module so we can run type inference on them
+    IRModule main_module = tec::GetMainModule(lowered_mod);

Review comment:
       The other problem is that even if you do successfully manage to look up 
the function, there are PrimFuncs in the module which the type inferencer 
doesn't know how to deal with. We could just skip any PrimFuncs we find during 
type inference.
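   To illustrate the suggestion, here is a minimal self-contained sketch (not the actual TVM API; the `Module`, `RelayFunc`, and `PrimFunc` types below are stand-ins for illustration only): a lowered module maps global names to either kind of function, and the type-inference pass simply skips the PrimFunc entries instead of rejecting the module.

```cpp
// Sketch only: stand-in types, not TVM's IRModule/BaseFunc hierarchy.
#include <map>
#include <string>
#include <variant>
#include <vector>

struct RelayFunc { std::string body; };  // a function type inference understands
struct PrimFunc  { std::string body; };  // a lowered TIR function it does not

// A lowered module mixes both kinds under one namespace,
// analogous to an IRModule after lowering.
using BaseFunc = std::variant<RelayFunc, PrimFunc>;
using Module = std::map<std::string, BaseFunc>;

// Visit each global function; skip PrimFuncs rather than erroring on them.
std::vector<std::string> InferTypes(const Module& mod) {
  std::vector<std::string> inferred;
  for (const auto& [name, fn] : mod) {
    if (std::holds_alternative<PrimFunc>(fn)) continue;  // skip PrimFuncs
    inferred.push_back(name);  // real inference would run on fn here
  }
  return inferred;
}
```

   The alternative in the patch above (extracting only the Relay functions with `tec::GetMainModule` before running inference) achieves the same effect by filtering up front rather than inside the pass.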



