leandron opened a new issue #8092:
URL: https://github.com/apache/tvm/issues/8092


   Since #7972 was merged, it is impossible to run the tests without having 
`USE_VERILATOR ON` in the build configuration.
   
   I understand this is enabled in our upstream CI, but I think it would 
be good for tests that depend on Verilator support to be skipped when 
running against a TVM build without Verilator, as we already do for many other tests.
   
   The error I see when running the tests without Verilator support is:
   ```
   =================================== FAILURES 
===================================
   ________________________________ test_mobilenet 
________________________________
   
       def test_mobilenet():
           """Mobilenet tests."""
   >       tmobilenet(4)
   
   tests/python/contrib/test_verilator/test_mobilenet.py:239: 
   _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ 
   tests/python/contrib/test_verilator/test_mobilenet.py:231: in tmobilenet
       res = run_model(mod, params, opts)
   tests/python/contrib/test_verilator/test_mobilenet.py:152: in run_model
       with transform.PassContext(opt_level=3, 
config={"relay.ext.verilator.options": opts}):
   python/tvm/ir/transform.py:85: in __init__
       _ffi_transform_api.PassContext, opt_level, required, disabled, trace, 
config
   python/tvm/_ffi/_ctypes/object.py:136: in __init_handle_by_constructor__
       handle = __init_by_constructor__(fconstructor, args)
   _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ 
   
   fconstructor = <tvm.runtime.packed_func.PackedFunc object at 0x7f1e3b526438>
   args = (3, [], [], None, {'relay.ext.verilator.options': {'lib_path': 
'/workspace/tests/python/contrib/test_verilator/../../../../3rdparty/vta-hw/apps/verilator/add/libverilator_4.so',
 'profiler_cycle_counter_id': 0, 'profiler_enable': True}})
   
       def __init_handle_by_constructor__(fconstructor, args):
           """Initialize handle by constructor"""
           temp_args = []
           values, tcodes, num_args = _make_tvm_args(args, temp_args)
           ret_val = TVMValue()
           ret_tcode = ctypes.c_int()
           if (
               _LIB.TVMFuncCall(
                   fconstructor.handle,
                   values,
                   tcodes,
                   ctypes.c_int(num_args),
                   ctypes.byref(ret_val),
                   ctypes.byref(ret_tcode),
               )
               != 0
           ):
   >           raise get_last_ffi_error()
   E           AttributeError: Traceback (most recent call last):
   E             12: TVMFuncCall
   E                   at /workspace/src/runtime/c_runtime_api.cc:474
   E             11: 
tvm::runtime::PackedFunc::CallPacked(tvm::runtime::TVMArgs, 
tvm::runtime::TVMRetValue*) const
   E                   at /workspace/include/tvm/runtime/packed_func.h:1150
   E             10: std::function<void (tvm::runtime::TVMArgs, 
tvm::runtime::TVMRetValue*)>::operator()(tvm::runtime::TVMArgs, 
tvm::runtime::TVMRetValue*) const
   E                   at /usr/include/c++/7/bits/std_function.h:706
   E             9: operator()
   E                   at /workspace/include/tvm/runtime/packed_func.h:1479
   E             8: unpack_call<tvm::transform::PassContext, 5, 
tvm::transform::<lambda(int, tvm::runtime::Array<tvm::runtime::String>, 
tvm::runtime::Array<tvm::runtime::String>, tvm::transform::TraceFunc, 
tvm::runtime::Optional<tvm::runtime::Map<tvm::runtime::String, 
tvm::runtime::ObjectRef> >)> >
   E                   at /workspace/include/tvm/runtime/packed_func.h:1420
   E             7: run<>
   E                   at /workspace/include/tvm/runtime/packed_func.h:1381
   E             6: run<tvm::runtime::TVMMovableArgValueWithContext_>
   E                   at /workspace/include/tvm/runtime/packed_func.h:1381
   E             5: run<tvm::runtime::TVMMovableArgValueWithContext_, 
tvm::runtime::TVMMovableArgValueWithContext_>
   E                   at /workspace/include/tvm/runtime/packed_func.h:1381
   E             4: run<tvm::runtime::TVMMovableArgValueWithContext_, 
tvm::runtime::TVMMovableArgValueWithContext_, 
tvm::runtime::TVMMovableArgValueWithContext_>
   E                   at /workspace/include/tvm/runtime/packed_func.h:1381
   E             3: run<tvm::runtime::TVMMovableArgValueWithContext_, 
tvm::runtime::TVMMovableArgValueWithContext_, 
tvm::runtime::TVMMovableArgValueWithContext_, 
tvm::runtime::TVMMovableArgValueWithContext_>
   E                   at /workspace/include/tvm/runtime/packed_func.h:1381
   E             2: run<tvm::runtime::TVMMovableArgValueWithContext_, 
tvm::runtime::TVMMovableArgValueWithContext_, 
tvm::runtime::TVMMovableArgValueWithContext_, 
tvm::runtime::TVMMovableArgValueWithContext_, 
tvm::runtime::TVMMovableArgValueWithContext_>
   E                   at /workspace/include/tvm/runtime/packed_func.h:1396
   E             1: operator()
   E                   at /workspace/src/ir/transform.cc:634
   E             0: 
tvm::transform::PassConfigManager::Legalize(tvm::runtime::Map<tvm::runtime::String,
 tvm::runtime::ObjectRef, void, void>*)
   E                   at /workspace/src/ir/transform.cc:125
   E             File "/workspace/src/ir/transform.cc", line 125
   E           AttributeError: Invalid config option 
'relay.ext.verilator.options' candidates are: 
relay.ext.vitis_ai.options.load_runtime_module 
,relay.ext.vitis_ai.options.export_runtime_module 
,relay.ext.vitis_ai.options.work_dir ,relay.ext.vitis_ai.options.build_dir 
,relay.ext.vitis_ai.options.target ,tir.detect_global_barrier 
,tir.InjectDoubleBuffer ,tir.HoistIfThenElse ,relay.FuseOps.max_depth 
,tir.instrument_bound_checkers ,tir.disable_vectorize ,tir.add_lower_pass 
,tir.noalias ,tir.UnrollLoop ,relay.backend.use_auto_scheduler 
,relay.backend.disable_compile_engine_cache ,tir.LoopPartition 
,relay.fallback_device_type ,tir.disable_assert ,relay.ext.vitis_ai.options 
,relay.ext.ethos-n.options
   ```
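   One way to get the skipping behavior described above is a `pytest` skip marker, sketched below. This is only a sketch: it assumes Verilator availability can be probed by looking up a registered global packed function, and the registry name `"verilator.profiler_clear"` is an assumption, not a confirmed API.
   ```python
   import pytest


   def verilator_enabled():
       """Return True if this TVM build appears to have Verilator support.

       Sketch only: assumes support can be detected by looking up a global
       packed function; the name "verilator.profiler_clear" is hypothetical.
       """
       try:
           import tvm
       except ImportError:
           return False
       return tvm.get_global_func("verilator.profiler_clear", allow_missing=True) is not None


   # Decorator for tests that need USE_VERILATOR ON, mirroring the skip
   # markers other contrib test suites already use.
   requires_verilator = pytest.mark.skipif(
       not verilator_enabled(),
       reason="Verilator support not enabled (build TVM with USE_VERILATOR ON)",
   )


   @requires_verilator
   def test_mobilenet():
       """Mobilenet tests."""
       ...
   ```
   With this in place, a build without Verilator would report the test as skipped instead of failing inside `PassContext`.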
   
   cc @vegaluisjose @tmoreau89 


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
[email protected]
