[Bug] GetStoreRule failure at simple Conv2d + Squeeze model #10528

Open
ganler opened this issue Mar 8, 2022 · 2 comments
Labels: relay:op, tir, type: bug

Comments

ganler (Contributor) commented Mar 8, 2022


Expected behavior

"""
def @main(%x: Tensor[(2, 2, 1, 1), float32]) -> Tensor[(2, 3, 4), float32] {
  %0 = nn.conv2d(%x, meta[relay.Constant][0] /* ty=Tensor[(1, 2, 3, 1), float32] */, strides=[2, 2], padding=[3, 3, 3, 3]) /* ty=Tensor[(2, 1, 3, 4), float32] */;
  squeeze(%0, axis=[1]) /* ty=Tensor[(2, 3, 4), float32] */
}
"""

This simple conv2d + squeeze model should compile successfully, but it fails.

Actual behavior

  5: tvm::relay::MixedModeMutator::VisitLeaf(tvm::RelayExpr const&)
        at /home/ganler/Documents/tvm/src/relay/ir/expr_functor.cc:81
  4: tvm::relay::TempRealizer::DispatchVisitExpr(tvm::RelayExpr const&)
        at /home/ganler/Documents/tvm/src/relay/transforms/forward_rewrite.cc:46
  3: tvm::relay::LayoutAlternatedExprNode<tvm::relay::alter_op_layout::AlterTransformMemorizer>::Realize() const
        at /home/ganler/Documents/tvm/src/relay/transforms/transform_layout.h:183
  2: tvm::relay::TransformMemorizer::Transform(tvm::RelayExpr, tvm::tir::Layout const&, tvm::tir::Layout const&)
        at /home/ganler/Documents/tvm/src/relay/transforms/transform_layout.h:115
  1: tvm::relay::TransformMemorizer::TransformHelper(tvm::RelayExpr, tvm::tir::Layout, tvm::tir::Layout)
        at /home/ganler/Documents/tvm/src/relay/transforms/transform_layout.h:157
  0: tvm::tir::BijectiveLayout::BijectiveLayout(tvm::tir::Layout, tvm::tir::Layout)
        at /home/ganler/Documents/tvm/src/tir/ir/data_layout.cc:421
  File "/home/ganler/Documents/tvm/src/tir/ir/data_layout.cc", line 422
TVMError: 
---------------------------------------------------------------
An error occurred during the execution of TVM.
For more information, please see: https://tvm.apache.org/docs/errors.html
---------------------------------------------------------------
  Check failed: (GetStoreRule(&n->index_backward_rule, &n->shape_backward_rule, n->dst_layout, n->src_layout)) is false: NHW1c  NHW

Environment

Linux; Clang-14; TVM tag 8f6fa8f.

Steps to reproduce

A minimal script that triggers the issue:

import numpy as np
import tvm
from tvm import relay

xshape = (2, 2, 1, 1)
x = relay.var("x", shape=xshape, dtype="float32")

# Conv2d with a single output channel, then squeeze the unit channel axis.
weight = relay.const(np.random.random((1, 2, 3, 1)).astype("float32"))
v1 = relay.nn.conv2d(x, weight=weight, strides=[2, 2], padding=[3, 3, 3, 3],
                     kernel_size=[3, 1], channels=1)
out = relay.squeeze(v1, axis=[1])

func = relay.Function([x], out)
mod = tvm.IRModule.from_expr(func)

# AlterOpLayout is enabled at opt_level >= 3; building at opt_level=4 hits the error.
with tvm.transform.PassContext(opt_level=4):
    relay.build_module.create_executor("graph", mod, tvm.cpu(), target="llvm").evaluate()

This seems to be related to #9996

cc: @masahi @yangulei

cc @Hzfengsy @junrushao

masahi self-assigned this Mar 8, 2022
yangulei (Contributor) commented Mar 8, 2022

It seems that we are trying to initialize a transform for NHW1c <--> NHW, which requires adding or removing a primal axis, and layout_transform doesn't support that so far. In fact, I don't think we need a layout transform here at all.
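
To make the constraint concrete, here is a minimal sketch (assuming the tvm.tir.bijective_layout Python binding on a recent build): splitting a primal axis into sub-axes is expressible, but a layout pair where a sub-axis has no matching primal axis is not, and that is exactly the check that fires above.

import tvm

# Fine: NCHW <-> NCHW4c keeps the same primal axes (N, C, H, W)
# and merely factors C into blocks of 4.
ok = tvm.tir.bijective_layout("NCHW", "NCHW4c")

# Not fine: NHW1c carries a subordinate c axis whose primal C is missing
# from both layouts, so GetStoreRule cannot derive a store rule and the
# BijectiveLayout constructor raises the check failure quoted above.
bad = tvm.tir.bijective_layout("NHW", "NHW1c")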

ganler (Contributor, Author) commented Mar 8, 2022

@yangulei Thanks for explaining this bug. Just curious: what would the workaround be? Do we need to implement new logic to transform this layout, or can we simply skip this optimization for such patterns during dispatch?
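
(A hedged note on the "skip this optimization" option: the stack trace points at the AlterOpLayout rewrite, and tvm.transform.PassContext accepts a disabled_pass list, so disabling that pass should sidestep the failing transform. A sketch, not verified against this exact build:)

import numpy as np
import tvm
from tvm import relay

xshape = (2, 2, 1, 1)
x = relay.var("x", shape=xshape, dtype="float32")
weight = relay.const(np.random.random((1, 2, 3, 1)).astype("float32"))
v1 = relay.nn.conv2d(x, weight=weight, strides=[2, 2], padding=[3, 3, 3, 3],
                     kernel_size=[3, 1], channels=1)
mod = tvm.IRModule.from_expr(relay.Function([x], relay.squeeze(v1, axis=[1])))

# Disabling AlterOpLayout skips the rewrite that requests the
# unsupported NHW <-> NHW1c layout transform.
with tvm.transform.PassContext(opt_level=4, disabled_pass=["AlterOpLayout"]):
    relay.build_module.create_executor("graph", mod, tvm.cpu(), target="llvm").evaluate()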

ganler mentioned this issue Oct 13, 2022
areusch added the needs-triage label Oct 19, 2022
driazati added the relay:op and tir labels and removed the needs-triage label Oct 19, 2022