gessha commented on issue #12567:
URL: https://github.com/apache/tvm/issues/12567#issuecomment-1703033730

   I was trying to reproduce the bug but I got stuck. Do you know what I did 
wrong?
   
   I built the tvm.ci_cpu container using
   `~/projects/tvm$ ./docker/build.sh ci_cpu`
   
   As per @leandron's instructions I tried installing onnx, but it seems that was 
unnecessary, as it's already installed in the last lines of the tvm.ci_cpu 
Dockerfile:
   
https://github.com/apache/tvm/blob/022299b51fa71c42f180b0c8e3afcac4eb50d71d/docker/Dockerfile.ci_cpu#L149-L151
   
   I entered the container using `./docker/bash.sh tvm.ci_cpu -it bash`.
   
   Then I tried running the test
   `pytest ./tests/python/contrib/test_onnx.py::test_resize`
   
   but I got the following error:
   
   ```
   INTERNALERROR> Traceback (most recent call last):
   INTERNALERROR>   File "/venv/apache-tvm-py3.8/lib/python3.8/site-packages/_pytest/main.py", line 266, in wrap_session
   INTERNALERROR>     config._do_configure()
   INTERNALERROR>   File "/venv/apache-tvm-py3.8/lib/python3.8/site-packages/_pytest/config/__init__.py", line 1054, in _do_configure
   INTERNALERROR>     self.hook.pytest_configure.call_historic(kwargs=dict(config=self))
   INTERNALERROR>   File "/venv/apache-tvm-py3.8/lib/python3.8/site-packages/pluggy/_hooks.py", line 514, in call_historic
   INTERNALERROR>     res = self._hookexec(self.name, self._hookimpls, kwargs, False)
   INTERNALERROR>   File "/venv/apache-tvm-py3.8/lib/python3.8/site-packages/pluggy/_manager.py", line 115, in _hookexec
   INTERNALERROR>     return self._inner_hookexec(hook_name, methods, kwargs, firstresult)
   INTERNALERROR>   File "/venv/apache-tvm-py3.8/lib/python3.8/site-packages/pluggy/_callers.py", line 113, in _multicall
   INTERNALERROR>     raise exception.with_traceback(exception.__traceback__)
   INTERNALERROR>   File "/venv/apache-tvm-py3.8/lib/python3.8/site-packages/pluggy/_callers.py", line 77, in _multicall
   INTERNALERROR>     res = hook_impl.function(*args)
   INTERNALERROR>   File "/home/georgi/projects/tvm/python/tvm/testing/plugin.py", line 70, in pytest_configure
   INTERNALERROR>     print("enabled targets:", "; ".join(map(lambda x: x[0], utils.enabled_targets())))
   INTERNALERROR>   File "/home/georgi/projects/tvm/python/tvm/testing/utils.py", line 526, in enabled_targets
   INTERNALERROR>     return [(t["target"], tvm.device(t["target"])) for t in _get_targets() if t["is_runnable"]]
   INTERNALERROR>   File "/home/georgi/projects/tvm/python/tvm/testing/utils.py", line 445, in _get_targets
   INTERNALERROR>     raise TVMError(
   INTERNALERROR> tvm._ffi.base.TVMError: None of the following targets are supported by this build of TVM: ['llvm', 'cuda', 'nvptx', 'vulkan -from_device=0', 'opencl', 'opencl -device=mali,aocl_sw_emu', 'opencl -device=intel_graphics', 'metal', 'rocm', 'hexagon']. Try setting TVM_TEST_TARGETS to a supported target. Cannot default to llvm, as it is not enabled.
   ```
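   
   In case it helps anyone else hitting this: the traceback references the host checkout at `/home/georgi/projects/tvm`, and the error reads like the libtvm being loaded wasn't built with LLVM. One route (not tried here) would be rebuilding TVM with `set(USE_LLVM ON)` in `build/config.cmake`; the other is pointing the test plugin at a target the build does support, roughly:
   
   ```shell
   # Sketch: tell TVM's test plugin which target to use explicitly.
   # (Assumption: llvm is actually enabled in the build being loaded;
   # otherwise substitute a target that is.)
   export TVM_TEST_TARGETS="llvm"
   # then re-run the failing test:
   # pytest ./tests/python/contrib/test_onnx.py::test_resize
   ```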

