trevor-m opened a new pull request #6150: URL: https://github.com/apache/incubator-tvm/pull/6150
Currently, any extra attributes of the LLVM target are not passed through to `LLVMModule` during `ModulePackImportsToLLVM`. This causes compilation to fail with `error: /tmp/tmpctakbpk1/devc.o uses VFP register arguments, output does not` when the target is opencl and the target_host requires `-mfloat-abi=soft`.

There are two reasons why:
1. `python/tvm/runtime/module.py` uses `_get_target_triple` to get the target triple for `ModulePackImportsToLLVM`. However, that function is implemented with `tm_->getTargetTriple().str()`, which only contains the standard `<arch><sub>-<vendor>-<sys>-<abi>` triple and is missing any extra flags or attributes such as `-mfloat-abi=soft`.
2. Since the module was created by `CodeGenBlob`, it doesn't store anything in the `"tvm_target"` module metadata, so `module_->getTargetTriple()` is called, which again lacks the extra flags and attributes: https://github.com/apache/incubator-tvm/blob/master/src/target/llvm/llvm_module.cc#L254

This PR fixes those problems by:
1. Adding important attributes back into `_get_target_triple` (only `mfloat-abi` for now). This part doesn't seem ideal to me. Any thoughts on how to improve it? Can we store the user's full target string earlier on?
2. Saving the full target string in the `"tvm_target"` module metadata during `CodeGenBlob` so that `LLVMModuleNode::Init(std::unique_ptr<llvm::Module> module, std::shared_ptr<llvm::LLVMContext> ctx)` can retrieve it: https://github.com/apache/incubator-tvm/blob/master/src/target/llvm/llvm_module.cc#L247 This matches what `LLVMModuleNode::Init(const IRModule& mod, std::string target)` already does.

Discuss post: https://discuss.tvm.ai/t/opencl-target-for-32-bit-arm-linux-android-broken-after-pr-4657/7252

Fixes https://github.com/apache/incubator-tvm/issues/6019
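To make the first cause concrete, here is a minimal, standalone Python sketch (not TVM's actual implementation; the target string and helper are illustrative) of why recovering only the bare triple from a target string drops attributes like `-mfloat-abi=soft`:

```python
# Illustrative sketch: a TVM-style target string carries both a triple
# and extra attributes, but an LLVM TargetMachine only preserves the
# triple, so attributes like -mfloat-abi=soft are lost unless they are
# re-attached explicitly. The parse_target helper below is hypothetical.

def parse_target(target_str):
    """Split a target string into the bare triple and extra attributes."""
    tokens = target_str.split()
    triple = None
    attrs = {}
    for tok in tokens[1:]:  # tokens[0] is the target kind, e.g. "llvm"
        key, _, value = tok.lstrip("-").partition("=")
        if key == "mtriple":
            triple = value
        else:
            attrs[key] = value
    return triple, attrs

target = "llvm -mtriple=armv7l-linux-gnueabihf -mfloat-abi=soft"
triple, attrs = parse_target(target)
print(triple)  # armv7l-linux-gnueabihf  (all that the bare triple keeps)
print(attrs)   # {'mfloat-abi': 'soft'}  (what is lost and must be restored)
```

This is why the fix re-attaches `mfloat-abi` in `_get_target_triple` and stores the full target string in module metadata rather than relying on the triple alone.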
