leonwanghui opened a new issue #5771: URL: https://github.com/apache/incubator-tvm/issues/5771
Hi @tqchen @jroesch , I was working on embedding the Rust runtime for the WASM target, but came across a problem when building the test-wasm32 crate.

## Error elaboration

First I should clarify that I can build and run the [test-wasm32](https://github.com/apache/incubator-tvm/tree/master/rust/runtime/tests/test_wasm32) package successfully. However, what I want is to build the `lib` rather than the `bin` type of package, so I changed several lines in the original `main.rs` and copied it to `lib.rs`:

```rust
extern "C" {
    static __tvm_module_ctx: i32;
}

#[no_mangle]
unsafe fn __get_tvm_module_ctx() -> i32 {
    // Refer to a symbol in libtest_wasm32.a to make sure that the link of the
    // library is not optimized out.
    __tvm_module_ctx
}

extern crate ndarray;
#[macro_use]
extern crate tvm_runtime;

use ndarray::Array;
use tvm_runtime::{DLTensor, Module as _, SystemLibModule};

#[no_mangle]
pub extern "C" fn run() {
    // try static
    let mut a = Array::from_vec(vec![1f32, 2., 3., 4.]);
    let mut b = Array::from_vec(vec![1f32, 0., 1., 0.]);
    let mut c = Array::from_vec(vec![0f32; 4]);
    let e = Array::from_vec(vec![2f32, 2., 4., 4.]);
    let mut a_dl: DLTensor = (&mut a).into();
    let mut b_dl: DLTensor = (&mut b).into();
    let mut c_dl: DLTensor = (&mut c).into();

    let syslib = SystemLibModule::default();
    let add = syslib
        .get_function("default_function")
        .expect("main function not found");
    call_packed!(add, &mut a_dl, &mut b_dl, &mut c_dl).unwrap();
    println!("{:?}", c_dl.data);
    assert!(c.all_close(&e, 1e-8f32));
}
```

But when I tried to compile the library, this error occurred:

```
undefined symbol: __tvm_module_ctx
```

I am guessing that something needs to be modified when changing the crate type from `bin` to `lib`, but I have no idea what, since I am missing the context. I would appreciate it if anyone could take a look.
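For context, the `bin`-to-`lib` switch I am describing would look roughly like this in `Cargo.toml` (a sketch only; the package name, edition, and dependency paths here are illustrative, not the exact ones in the repo):

```toml
[package]
name = "test-wasm32"
version = "0.1.0"
edition = "2018"

[lib]
path = "src/lib.rs"
# Build a C-compatible dynamic library instead of the default `bin` target;
# `cdylib` is what a wasm32 target typically needs so that `run` is exported.
crate-type = ["cdylib"]

[dependencies]
# Illustrative paths; the real crates live inside the TVM repository.
ndarray = "0.12"
tvm-runtime = { path = "../.." }
```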
