[Bug] AlterOpLayout failed on Conv->Transpose->Conv #10109

Closed · lazycal opened this issue Jan 30, 2022 · 0 comments · Fixed by #10118
lazycal commented Jan 30, 2022


import tvm
from tvm import relay
import numpy as np


# Conv -> transpose (swap H and W) -> Conv
x = relay.var("x", shape=(1, 1, 24, 48))
w1 = relay.const(np.random.uniform(size=(1, 1, 1, 1)))
w2 = relay.const(np.random.uniform(size=(1, 1, 1, 1)))
y = relay.nn.conv2d(x, w1, kernel_size=(1, 1), padding=(0, 0), channels=1)
y = relay.transpose(y, (0, 1, 3, 2))  # swap the H and W axes
z = relay.nn.conv2d(y, w2, kernel_size=(1, 1), padding=(0, 0), channels=1)
func = relay.Function([x], z)
mod = tvm.IRModule.from_expr(func)
print(mod)
# Passes when AlterOpLayout is disabled:
# with tvm.transform.PassContext(opt_level=3, disabled_pass=["AlterOpLayout"]):
with tvm.transform.PassContext(opt_level=3):
    res = relay.build_module.create_executor('graph', mod, target='llvm', device=tvm.cpu()).evaluate()(
        np.random.uniform(size=(1, 1, 24, 48)).astype(np.float32))
print(res)

The above model first runs a conv2d, then a transpose that swaps the H and W dimensions, and finally another conv2d. It fails with the following error:

One or more operators have not been tuned. Please tune your model for better performance. Use DEBUG logging level to see more details.
The Relay type checker is unable to show the following types match:
  Tensor[(1, 1, 24, 48), float32]
  Tensor[(1, 1, 48, 24), float32]

The root cause is similar to https://discuss.tvm.apache.org/t/pytorch-layout-cannot-convert-f-linear-x-f-linear-y-z/10866. In short, during the AlterOpLayout pass each dimension is assumed to carry a specific semantic (e.g., H, W, O, I, ...); when a transpose breaks that assumption, the pass becomes fragile.
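For reference, the commented-out PassContext line in the repro corresponds to a workaround: disabling AlterOpLayout avoids the crash, at the cost of skipping that optimization. A minimal sketch, reusing mod from the repro above:

# Workaround: disable AlterOpLayout so the conflicting layout rewrite is never applied.
with tvm.transform.PassContext(opt_level=3, disabled_pass=["AlterOpLayout"]):
    res = relay.build_module.create_executor(
        "graph", mod, target="llvm", device=tvm.cpu()
    ).evaluate()(np.random.uniform(size=(1, 1, 24, 48)).astype(np.float32))
print(res)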

Environment

OS: Ubuntu 18.04
TVM: 6a274af
