
[BYOC][DNNL] TVMError: Unknown composite function:"dnnl.qnn.dense" and Check failed: const_var_ndarray_.count(var) > 0 (0 vs. 0) : ConstLoaderModuleNode is missing entry for constant 'tvmgen_default_dnnl_main_0_const_0' for function 'tvmgen_default_dnnl_main_0' #13222

Closed
yuwenjun1988 opened this issue Oct 28, 2022 · 1 comment
Labels: needs-triage, type: bug

Comments

yuwenjun1988 commented Oct 28, 2022

1. Expected behavior
No crash.

2. Actual behavior
Crash stacks (two failing tests):

(1) test_dnnl.py:1758:


test_dnnl.py:1414: in check_result
check_vm_result()
test_dnnl.py:1397: in check_vm_result
exe = relay.vm.compile(mod, target=target, params=params)
../../../python/tvm/relay/backend/vm.py:67: in compile
compiler.lower(mod, target, target_host)
../../../python/tvm/relay/backend/vm.py:126: in lower
self._lower(mod, raw_targets)


self = <tvm.runtime.packed_func.PackedFunc object at 0x7f3698105080>
args = (#[version = "0.0.5"]
def @main(%in_0: Tensor[(2, 10), uint8] /* ty=Tensor[(2, 10), uint8] */) -> Tensor[(2, 16), uint...AATUMAAIjBAAAOwwAAyEEAAPBBAABCwwAAEUMAANDBAAA/Qw=="
],
"attrs": {"tvm_version": "0.11.dev0"}
}, [llvm -keys=cpu ])
temp_args = [], values = <tvm._ffi._ctypes.packed_func.TVMValue_Array_2 object at 0x7f368e874ac0>, tcodes = <tvm._ffi._ctypes.packed_func.c_int_Array_2 object at 0x7f368e874cc0>

def __call__(self, *args):
    """Call the function with positional arguments

    args : list
       The positional arguments to the function call.
    """
    temp_args = []
    values, tcodes, num_args = _make_tvm_args(args, temp_args)
    ret_val = TVMValue()
    ret_tcode = ctypes.c_int()
    if (
        _LIB.TVMFuncCall(
            self.handle,
            values,
            tcodes,
            ctypes.c_int(num_args),
            ctypes.byref(ret_val),
            ctypes.byref(ret_tcode),
        )
        != 0
    ):
      raise get_last_ffi_error()

E tvm._ffi.base.TVMError: Traceback (most recent call last):
E 22: TVMFuncCall
E 21: tvm::runtime::PackedFuncObj::Extractor<tvm::runtime::PackedFuncSubObj<tvm::relay::vm::VMCompiler::GetFunction(std::__cxx11::basic_string<char, std::char_traits, std::allocator > const&, tvm::runtime::ObjectPtrtvm::runtime::Object const&)::{lambda(tvm::runtime::TVMArgs, tvm::runtime::TVMRetValue*)#1}> >::Call(tvm::runtime::PackedFuncObj const*, tvm::runtime::TVMArgs, tvm::runtime::TVMRetValue*)
E 20: tvm::relay::vm::VMCompiler::Lower(tvm::IRModule, tvm::runtime::Array<tvm::Target, void> const&)
E 19: tvm::relay::vm::VMCompiler::LowerImpl(tvm::IRModule)
E 18: tvm::relay::vm::VMCompiler::OptimizeModuleImpl(tvm::IRModule)
E 17: tvm::transform::Pass::operator()(tvm::IRModule) const
E 16: tvm::transform::Pass::operator()(tvm::IRModule, tvm::transform::PassContext const&) const
E 15: tvm::transform::SequentialNode::operator()(tvm::IRModule, tvm::transform::PassContext const&) const
E 14: tvm::transform::Pass::operator()(tvm::IRModule, tvm::transform::PassContext const&) const
E 13: tvm::transform::SequentialNode::operator()(tvm::IRModule, tvm::transform::PassContext const&) const
E 12: tvm::transform::Pass::operator()(tvm::IRModule, tvm::transform::PassContext const&) const
E 11: tvm::transform::ModulePassNode::operator()(tvm::IRModule, tvm::transform::PassContext const&) const
E 10: ZN3tvm7runtime13PackedFuncObj9ExtractorINS0_16PackedFuncSubObjIZNS0_15TypedPackedFuncIFNS_8IRModuleES5_NS_9transform11PassContextEEE17AssignTypedLambdaIZNS_5relay3tec7LowerTEENS0_6StringENS_17CompilationConfigESt8functionIFvNS_8BaseFuncEEEEUlS5_S7_E_EEvT_EUlRKNS0_7TVMArgsEPNS0_11TVMRetValueEE_EEE4CallEPKS1_SL_SP
E 9: tvm::relay::tec::LowerTE(tvm::IRModule const&, tvm::runtime::String const&, std::function<void (tvm::BaseFunc)>, tvm::CompilationConfig)
E 8: tvm::relay::tec::TECompilerImpl::LowerExternalFunctions()
E 7: tvm::runtime::PackedFuncObj::Extractor<tvm::runtime::PackedFuncSubObj<tvm::runtime::TypedPackedFunc<tvm::runtime::Module (tvm::runtime::ObjectRef const&)>::AssignTypedLambda<tvm::runtime::Module ()(tvm::runtime::ObjectRef const&)>(tvm::runtime::Module ()(tvm::runtime::ObjectRef const&), std::__cxx11::basic_string<char, std::char_traits, std::allocator >)::{lambda(tvm::runtime::TVMArgs const&, tvm::runtime::TVMRetValue*)#1}> >::Call(tvm::runtime::PackedFuncObj const*, tvm::runtime::TVMArgs, tvm::runtime::TVMRetValue*)
E 6: tvm::relay::contrib::DNNLCompiler(tvm::runtime::ObjectRef const&)
E 5: tvm::relay::contrib::DNNLModuleCodegen::CreateCSourceModule(tvm::runtime::ObjectRef const&)
E 4: tvm::relay::backend::MemoizedExprTranslator<std::vector<tvm::relay::contrib::Output, std::allocatortvm::relay::contrib::Output > >::VisitExpr(tvm::RelayExpr const&)
E 3: tvm::relay::ExprFunctor<std::vector<tvm::relay::contrib::Output, std::allocatortvm::relay::contrib::Output > (tvm::RelayExpr const&)>::VisitExpr(tvm::RelayExpr const&)
E 2: ZZN3tvm5relay11ExprFunctorIFSt6vectorINS0_7contrib6OutputESaIS4_EERK
E 1: tvm::relay::contrib::CodegenDNNL::VisitExpr_(tvm::relay::CallNode const*)
E 0: tvm::relay::contrib::CodegenDNNL::GenerateCompositeFunctionCall(tvm::relay::FunctionNode const*, tvm::relay::CallNode const*)
E File "/mnt/e/code/tvm/src/relay/backend/contrib/dnnl/codegen.cc", line 314
E TVMError: Unknown composite function:"dnnl.qnn.dense"
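
For reference, a rough sketch of the compile path that hits this error (not the exact test case): it assumes a qnn.dense + requantize expression that matches the "dnnl.qnn.dense" pattern registered by tvm.relay.op.contrib.dnnl; the real test at test_dnnl.py:1758 builds the exact pattern, and the shapes and scales below are illustrative only.

```python
import numpy as np
import tvm
from tvm import relay
from tvm.relay.op.contrib import dnnl

data = relay.var("in_0", shape=(2, 10), dtype="uint8")
weight = relay.const(np.random.randint(-127, 127, size=(16, 10)).astype("int8"))
dense = relay.qnn.op.dense(
    data, weight,
    input_zero_point=relay.const(0), kernel_zero_point=relay.const(0),
    input_scale=relay.const(0.1), kernel_scale=relay.const(0.1),
    units=16,
)
out = relay.qnn.op.requantize(
    dense,
    input_scale=relay.const(0.01), input_zero_point=relay.const(0),
    output_scale=relay.const(0.2), output_zero_point=relay.const(0),
    out_dtype="uint8",
)

# Partition the matched pattern into a "dnnl.qnn.dense" composite inside a
# dnnl-external function, then compile for the VM as check_vm_result() does.
mod = dnnl.partition_for_dnnl(tvm.IRModule.from_expr(out))
exe = relay.vm.compile(mod, target="llvm")  # fails here with USE_DNNL=C_SRC
```

With the JSON codegen this compiles; with USE_DNNL=C_SRC, CodegenDNNL::GenerateCompositeFunctionCall only handles a small hard-coded set of composite names and raises the error above for anything else.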

(2) test_dnnl.py:940:


test_dnnl.py:228: in run_and_verify_func
run_and_verify(
test_dnnl.py:196: in run_and_verify
func = relay.create_executor(
../../../python/tvm/relay/backend/interpreter.py:171: in evaluate
return self._make_executor()
../../../python/tvm/relay/build_module.py:519: in _make_executor
mod = build(self.mod, target=self.target)
../../../python/tvm/relay/build_module.py:364: in build
graph_json, runtime_mod, params = bld_mod.build(
../../../python/tvm/relay/build_module.py:161: in build
self._build(


self = <tvm.runtime.packed_func.PackedFunc object at 0x7f368e77a800>
args = (#[version = "0.0.5"]
def @main(%x: Tensor[(1, 16), float32] /* ty=Tensor[(1, 16), float32] */) -> Tensor[(1, 32), flo... ],
"attrs": {"tvm_version": "0.11.dev0"}
}, [llvm -keys=cpu ], None, graph{"link-params": (bool)0}, cpp, None, ...)
temp_args = [], values = <tvm._ffi._ctypes.packed_func.TVMValue_Array_8 object at 0x7f368e67f5c0>, tcodes = <tvm._ffi._ctypes.packed_func.c_int_Array_8 object at 0x7f368e67f7c0>

def __call__(self, *args):
    """Call the function with positional arguments

    args : list
       The positional arguments to the function call.
    """
    temp_args = []
    values, tcodes, num_args = _make_tvm_args(args, temp_args)
    ret_val = TVMValue()
    ret_tcode = ctypes.c_int()
    if (
        _LIB.TVMFuncCall(
            self.handle,
            values,
            tcodes,
            ctypes.c_int(num_args),
            ctypes.byref(ret_val),
            ctypes.byref(ret_tcode),
        )
        != 0
    ):
      raise get_last_ffi_error()

E tvm._ffi.base.TVMError: Traceback (most recent call last):
E 6: TVMFuncCall
E 5: tvm::relay::backend::RelayBuildModule::GetFunction(std::__cxx11::basic_string<char, std::char_traits, std::allocator > const&, tvm::runtime::ObjectPtrtvm::runtime::Object const&)::{lambda(tvm::runtime::TVMArgs, tvm::runtime::TVMRetValue*)#3}::operator()(tvm::runtime::TVMArgs, tvm::runtime::TVMRetValue*) const
E 4: tvm::relay::backend::RelayBuildModule::Build(tvm::IRModule, tvm::runtime::Array<tvm::Target, void> const&, tvm::Target const&, tvm::relay::Executor const&, tvm::relay::Runtime const&, tvm::WorkspaceMemoryPools const&, tvm::ConstantMemoryPools const&, tvm::runtime::String)
E 3: tvm::relay::backend::RelayBuildModule::BuildRelay(tvm::IRModule, tvm::runtime::String const&)
E 2: tvm::codegen::CreateMetadataModule(std::unordered_map<std::__cxx11::basic_string<char, std::char_traits, std::allocator >, tvm::runtime::NDArray, std::hash<std::__cxx11::basic_string<char, std::char_traits, std::allocator > >, std::equal_to<std::__cxx11::basic_string<char, std::char_traits, std::allocator > >, std::allocator<std::pair<std::__cxx11::basic_string<char, std::char_traits, std::allocator > const, tvm::runtime::NDArray> > > const&, tvm::runtime::Module, tvm::runtime::Array<tvm::runtime::Module, void> const&, tvm::Target, tvm::relay::Runtime, tvm::relay::Executor, tvm::relay::backend::ExecutorCodegenMetadata)
E 1: ZN3tvm7runtime23ConstLoaderModuleCreateERKSt13unordered_mapINSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEEENS0_7NDArrayESt4hashIS7_ESt8equal_toIS7_ESaISt4pairIKS7_S8_EEERKS1_IS7_St6vectorIS7_SaIS7_EESA
E 0: tvm::runtime::ConstLoaderModuleNode::ConstLoaderModuleNode(std::unordered_map<std::__cxx11::basic_string<char, std::char_traits, std::allocator >, tvm::runtime::NDArray, std::hash<std::__cxx11::basic_string<char, std::char_traits, std::allocator > >, std::equal_to<std::__cxx11::basic_string<char, std::char_traits, std::allocator > >, std::allocator<std::pair<std::__cxx11::basic_string<char, std::char_traits, std::allocator > const, tvm::runtime::NDArray> > > const&, std::unordered_map<std::__cxx11::basic_string<char, std::char_traits, std::allocator >, std::vector<std::__cxx11::basic_string<char, std::char_traits, std::allocator >, std::allocator<std::__cxx11::basic_string<char, std::char_traits, std::allocator > > >, std::hash<std::__cxx11::basic_string<char, std::char_traits, std::allocator > >, std::equal_to<std::__cxx11::basic_string<char, std::char_traits, std::allocator > >, std::allocator<std::pair<std::__cxx11::basic_string<char, std::char_traits, std::allocator > const, std::vector<std::__cxx11::basic_string<char, std::char_traits, std::allocator >, std::allocator<std::_cxx11::basic_string<char, std::char_traits, std::allocator > > > > > > const&)
E File "/mnt/e/code/tvm/src/runtime/const_loader_module.cc", line 62
E TVMError:
E ---------------------------------------------------------------
E An error occurred during the execution of TVM.
E For more information, please see: https://tvm.apache.org/docs/errors.html
E ---------------------------------------------------------------
E
E Check failed: const_var_ndarray_.count(var) > 0 (0 vs. 0) : ConstLoaderModuleNode is missing entry for constant 'tvmgen_default_dnnl_main_0_const_0' for function 'tvmgen_default_dnnl_main_0'
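
Similarly, a minimal sketch of the graph-executor build path that trips the ConstLoaderModule check above, assuming a plain float32 dense with a constant weight is offloaded to DNNL by partition_for_dnnl (shapes are illustrative, not the exact case from test_dnnl.py:940):

```python
import numpy as np
import tvm
from tvm import relay
from tvm.relay.op.contrib import dnnl

x = relay.var("x", shape=(1, 16), dtype="float32")
w = relay.const(np.random.uniform(-1, 1, (32, 16)).astype("float32"))
mod = dnnl.partition_for_dnnl(tvm.IRModule.from_expr(relay.nn.dense(x, w)))

# With USE_DNNL=C_SRC the external C source module declares the constant
# 'tvmgen_default_dnnl_main_0_const_0', but no matching NDArray reaches the
# ConstLoaderModule, so the check at const_loader_module.cc:62 fires here.
lib = relay.build(mod, target="llvm")
```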

3. Environment
Ubuntu 18.04, TVM 0.11.dev0

4. Steps to reproduce

(1) Run tvm/tests/python/contrib/test_dnnl.py.

(2) With set(USE_DNNL ON) in config.cmake the tests pass, but with set(USE_DNNL C_SRC) they fail as above.
In addition, with set(USE_DNNL C_SRC) the TVM build itself does not compile, and DNNL.cmake also needs to be modified; I fixed the compilation errors and DNNL.cmake locally to get this far.
You can try set(USE_DNNL C_SRC) and run test_dnnl.py to reproduce.
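
To double-check which DNNL flavor a given libtvm build actually provides before running the tests, something like the following should work (a sketch; both globals are registered by TVM's DNNL sources when the corresponding cmake option is enabled):

```python
import tvm

# Registered by the BYOC DNNL codegen (present for both ON/JSON and C_SRC builds).
has_dnnl_codegen = tvm.get_global_func("relay.ext.dnnl", allow_missing=True) is not None

# Registered only by the JSON runtime flavor (USE_DNNL=ON or JSON).
has_dnnl_json_runtime = (
    tvm.get_global_func("runtime.DNNLJSONRuntimeCreate", allow_missing=True) is not None
)

print("dnnl codegen:", has_dnnl_codegen, "| dnnl json runtime:", has_dnnl_json_runtime)
```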

Compile error messages with set(USE_DNNL C_SRC):
/mnt/e/code/tvm/src/relay/backend/contrib/dnnl/codegen.cc: In member function ‘tvm::relay::contrib::GenerateBodyOutput tvm::relay::contrib::CodegenDNNL::GenerateCompositeFunctionCall(const tvm::relay::FunctionNode*, const tvm::relay::CallNode*)’:
/mnt/e/code/tvm/src/relay/backend/contrib/dnnl/codegen.cc:279:99: error: call of overloaded ‘GetRootCall(const tvm::relay::CallNode*, int, )’ is ambiguous
const auto* conv_call = GetRootCall(callee->body.as(), 1, {"nn.conv2d", "nn.relu"});
^
In file included from /mnt/e/code/tvm/src/relay/backend/contrib/dnnl/codegen.cc:37:0:
/mnt/e/code/tvm/src/relay/backend/contrib/dnnl/../../utils.h:532:24: note: candidate: const tvm::relay::CallNode* tvm::relay::backend::GetRootCall(const tvm::relay::CallNode*, int, const std::vector<std::_cxx11::basic_string >&)
inline const CallNode* GetRootCall(const CallNode* current_call, int depth,
^~~~~~~~~~~
/mnt/e/code/tvm/src/relay/backend/contrib/dnnl/../../utils.h:581:24: note: candidate: const tvm::relay::CallNode* tvm::relay::backend::GetRootCall(const tvm::relay::CallNode*, int, const string&)
inline const CallNode* GetRootCall(const CallNode* current_call, int max_depth,
^~~~~~~~~~~
/mnt/e/code/tvm/src/relay/backend/contrib/dnnl/codegen.cc: In member function ‘virtual void tvm::relay::contrib::DNNLConstantUpdater::VisitExpr_(const tvm::relay::CallNode*)’:
/mnt/e/code/tvm/src/relay/backend/contrib/dnnl/codegen.cc:596:29: error: ‘BindToCallNodeArgs’ was not declared in this scope
auto args = root_cn ? BindToCallNodeArgs(args_loc, cn) : cn->args;
^~~~~~~~~~~~~~~~~~
/mnt/e/code/tvm/src/relay/backend/contrib/dnnl/codegen.cc:599:30: error: unable to deduce ‘auto&&’ from ‘args’
for (const auto& arg : args) {
^~~~
CMakeFiles/tvm_objs.dir/build.make:8307: recipe for target 'CMakeFiles/tvm_objs.dir/src/relay/backend/contrib/dnnl/codegen.cc.o' failed
make[2]: *** [CMakeFiles/tvm_objs.dir/src/relay/backend/contrib/dnnl/codegen.cc.o] Error 1
CMakeFiles/Makefile2:850: recipe for target 'CMakeFiles/tvm_objs.dir/all' failed
make[1]: *** [CMakeFiles/tvm_objs.dir/all] Error 2
Makefile:145: recipe for target 'all' failed
make: *** [all] Error 2

yuwenjun1988 added the needs-triage and type: bug labels on Oct 28, 2022
yangulei (Contributor) commented

The DNNL C source codegen is just a POC. Use set(USE_DNNL JSON) or set(USE_DNNL ON) to use the JSON runtime instead; for more details, please refer to https://github.com/apache/tvm-rfcs/blob/main/rfcs/0069-byoc-onednn-integration.md.
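
As a quick sanity check after rebuilding with the JSON runtime, one can count how many subgraphs were actually handed to the dnnl backend; the helper below is only an illustration, not part of test_dnnl.py:

```python
import tvm


def count_dnnl_partitions(mod: tvm.IRModule) -> int:
    """Count partitioned global functions tagged with Compiler="dnnl"."""
    count = 0
    for gvar in mod.get_global_vars():
        func = mod[gvar]
        attrs = func.attrs
        if attrs is not None and "Compiler" in attrs and attrs["Compiler"] == "dnnl":
            count += 1
    return count


# Usage (after mod = dnnl.partition_for_dnnl(mod)):
#   assert count_dnnl_partitions(mod) > 0
```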
