JieGH opened a new issue, #17508:
URL: https://github.com/apache/tvm/issues/17508

   
   ---
   
   ### Expected Behavior
   
   After building TVM 0.18.0 with LLVM 19.1.3, I expect TVM to generate RISC-V-compatible code that executes without errors about unsupported CPU types. The build should allow a basic TVM Python example to run on a Banana Pi K1 board, with the `riscv64-linux-gnu` target specified in the configuration.
   
   ### Actual Behavior
   
   Upon running a simple TVM example with LLVM 19.1.3 and TVM 0.18.0 on the 
Banana Pi K1, I encounter the following error message:
   
   ```bash
   Unsupported CPU type!
   UNREACHABLE executed at 
/home/jlei/llvm-project/llvm/lib/ExecutionEngine/RuntimeDyld/RuntimeDyldELF.cpp:1080!
   ```
   In the TVM logs, there is also a warning that native vector bits are set to 
128 for RISC-V, which could be relevant to the issue. The error persists 
despite multiple rebuilds of both LLVM and TVM, with adjusted configurations 
and target-specific flags to ensure compatibility with the RISC-V architecture 
on this board.
   
   The error appears to stem from LLVM’s RuntimeDyldELF.cpp file, and recent 
threads, such as [LLVM Issue 
#58652](https://github.com/llvm/llvm-project/issues/58652) and [Halide Issue 
#7078](https://github.com/halide/Halide/issues/7078), mention related problems 
that were resolved in newer LLVM releases, motivating my decision to upgrade 
from LLVM 15.0.7 to 19.1.3.
   
   ### Environment

   - Operating System: Banana Pi K1 OS (version 1.X, latest)
   - LLVM Version: 19.1.3 (default target: `riscv64-linux-gnu`; host CPU: `generic-rv64`)
   - TVM Version: 0.18.0
   - Target triple configuration in TVM: `llvm -mtriple=riscv64-linux-gnu -mcpu=generic-rv64`
   - Architecture flags: `-march=rv64gc -mabi=lp64d`
   - Other configuration flags:
     - `USE_LLVM` set to `"llvm-config --ignore-libllvm --link-static"`
     - GPU backends such as CUDA, Vulkan, and OpenCL disabled
     - `USE_TVM_RUNTIME`, `USE_PROFILER`, and `USE_GRAPH_RUNTIME` set to `ON`
     - Profiling, graph runtime, and related libraries enabled; unneeded libraries such as MKL and NNPACK disabled
     - Builds attempted with both `RelWithDebInfo` and `Release` build types
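   For reference, the target string in the environment above is what `tvm.target.Target` parses into a target kind plus options. A rough standalone sketch of that split (a hypothetical helper for illustration, not TVM's actual parser):

```python
# Hypothetical sketch: split a TVM-style target string into its kind and
# its "-key=value" options, mirroring the configuration listed above.
def parse_target(target_str):
    kind, *opts = target_str.split()
    options = {}
    for opt in opts:
        key, _, value = opt.lstrip("-").partition("=")
        options[key] = value
    return kind, options

kind, options = parse_target("llvm -mtriple=riscv64-linux-gnu -mcpu=generic-rv64")
# kind is "llvm"; options map "mtriple" to "riscv64-linux-gnu"
# and "mcpu" to "generic-rv64".
```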
   
   ### Steps to Reproduce

   1. Compile LLVM 19.1.3 with the following configuration:
      - Specify the `riscv64-linux-gnu` target explicitly during the build.
      - Build LLVM with optimized settings and assertions enabled, and set default and target-specific flags for RISC-V compatibility.
   2. Configure and build TVM 0.18.0:
      - Specify the target triple as `riscv64-linux-gnu`.
      - Set the architecture flags `-march=rv64gc -mabi=lp64d`.
      - Disable unnecessary backends and enable LLVM and the RISC-V-specific configuration.
      - Ensure no additional RISC-V flags are set in the LLVM configuration, to isolate any unsupported-flag issues.
   3. Run a simple TVM Python example (such as a matrix multiplication or a basic compute test) on the Banana Pi K1 with the above setup to trigger the CPU error.
   
   ### Additional Notes and Troubleshooting

   - I have attempted multiple builds of LLVM and TVM, with minimal changes each time, to pinpoint the issue.
   - Cross-referencing related issues such as [LLVM Issue #58652](https://github.com/llvm/llvm-project/issues/58652) suggests this might be linked to incomplete support for specific RISC-V targets or configurations.
   - Despite the "Unsupported CPU" error, Python finishes executing the TVM script, but the generated LLVM code fails to execute.
   - Notably, the error does not occur with the older LLVM version (15.0.7), although that version cannot produce LLVM code properly for the required RISC-V target.
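   One additional check worth trying (my suggestion, not something already attempted above): `llc` from the LLVM 19.1.3 build can list the CPU names its RISC-V backend accepts, which shows whether `generic-rv64` is a valid `-mcpu` value for that build. A sketch, assuming the matching `llc` is on `PATH`:

```shell
# Confirm the RISC-V backend is registered in this LLVM build.
llc --version | grep -i riscv

# List the CPU names this LLVM accepts for the riscv64 target;
# "generic-rv64" should appear here if the -mcpu value is valid.
llc -mtriple=riscv64-linux-gnu -mcpu=help -o /dev/null /dev/null 2>&1 | head -n 20
```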
   
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]
