yongwww commented on PR #16531: URL: https://github.com/apache/tvm/pull/16531#issuecomment-1970344968
Thank you for the proposal! With the introduction of the SLM, we are now able to use TVM's `nn.Module` to support models created with Torch, which has been working well as far as I know. I am not sure whether the SLM covers the cases you are working on; here is the Llama model implemented via SLM ([the SLM llama model](https://github.com/mlc-ai/mlc-llm/blob/main/python/mlc_chat/model/llama/llama_model.py)). Since Torch FX appears to be the underlying representation of the exported graph, I am wondering whether it is possible to use or extend the existing FX translator (`relax/frontend/torch/fx_translator.py`) to support the `ExportedProgram`.
